Denial of service when parsing a JSON object with an unexpected field that has a big number #187
Comments
Hi @plokhotnyuk, is
@marcospereira those are different cases:
Hi @plokhotnyuk, thanks for the additional information. I was able to reproduce the problem using
Hmm, actually, I think we can impose some limits here: `play-json/play-json/jvm/src/main/scala/play/api/libs/json/jackson/JacksonJson.scala`, line 144 at commit 327bfd6.
Getting the text value is a (way) cheaper operation. We still need to load all the data into memory (not sure if Jackson offers another way to check the value length), but we can check whether the number of digits is beyond an acceptable limit (I see jsoniter is using these limits by default). It looks like a workaround to me, but at least we can move forward without depending entirely on Jackson. WDYT?
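The digit-count guard suggested above could be sketched as follows. This is plain Java (the relevant API is `java.math.BigDecimal`, so it applies equally from Scala); the `MAX_DIGITS` value and the `parseBounded` helper are illustrative assumptions, not play-json's actual API:

```java
import java.math.BigDecimal;

public class DigitLimitGuard {
    // Illustrative limit; jsoniter-scala applies a similar default cap.
    static final int MAX_DIGITS = 308;

    static BigDecimal parseBounded(String text) {
        // Counting digit characters is O(n) and far cheaper than
        // building and operating on a huge BigDecimal.
        int digits = 0;
        for (int i = 0; i < text.length(); i++) {
            char c = text.charAt(i);
            if (c >= '0' && c <= '9') digits++;
        }
        if (digits > MAX_DIGITS) {
            throw new IllegalArgumentException(
                "number has " + digits + " digits, limit is " + MAX_DIGITS);
        }
        return new BigDecimal(text);
    }

    public static void main(String[] args) {
        System.out.println(parseBounded("3.14159"));
        StringBuilder huge = new StringBuilder();
        for (int i = 0; i < 1000; i++) huge.append('9');
        try {
            parseBounded(huge.toString());
        } catch (IllegalArgumentException e) {
            System.out.println("rejected: " + e.getMessage());
        }
    }
}
```

The guard runs on the raw text before any `BigDecimal` is constructed, which is the point of checking the text value rather than the parsed number.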
@marcospereira IMHO, patches for the #180, #186, and #187 issues of Play-JSON should be released without waiting for patched Jackson, Scala, or Java libraries. Also, we should avoid returning such parsed numbers as-is. Just try how this code works with different Scala versions:
or
Most users (who usually do their financial calculations with big decimals) are not aware that such pure functions have a side effect that cannot be easily ignored on contemporary hardware.
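The hidden cost of such seemingly pure calls can be shown without play-json at all; a minimal Java illustration (the exponent size is chosen arbitrarily for the demonstration):

```java
import java.math.BigDecimal;

public class HiddenCost {
    public static void main(String[] args) {
        // "1e1000000" is only nine characters of input, yet it denotes a
        // number with a million digits. Constructing the BigDecimal is
        // cheap because the internal representation stays compact...
        BigDecimal d = new BigDecimal("1e1000000");
        // ...but a seemingly pure call like toPlainString() expands it
        // to its full written form: "1" followed by 1,000,000 zeros.
        String plain = d.toPlainString();
        System.out.println("input length: 9, expanded length: " + plain.length());
    }
}
```

Operations such as `toBigInteger()` or arithmetic on the expanded value are more expensive still, which is why returning such values to unsuspecting callers is risky.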
## Fixes

Fixes #187

## Purpose

Parsing large big decimals (think tens of hundreds of digits) and operating on these numbers can be very CPU demanding. While play-json currently supports handling large numbers, it is not practical in real-world applications and can expose them to denial-of-service (DoS) attacks. This changes the way parsing happens to limit the size of such numbers based on MathContext.DECIMAL128.
* Avoid parsing large big decimals (#200)

  Parsing large big decimals (think tens of hundreds of digits) and operating on these numbers can be very CPU demanding. While play-json currently supports handling large numbers, it is not practical in real-world applications and can expose them to denial-of-service attacks. This changes the way parsing happens to limit the size of such numbers based on MathContext.DECIMAL128.

* Format details
* Fix typo
* Remove test duplication
* Add breadcrumbs detailing where precision is defined
* Improve parsing readability
* Improve test readability
* Make it possible to configure the parsing of large big decimals (#191)

  Fixes #187

* Fix binary compatibility issues
* Codec for BigInt (#122)
* MiMa
* More tests
* Add small comment about bincompat filter
* Fix Scala 2.10 compatibility issue
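The effect of bounding parsed values with `MathContext.DECIMAL128`, as the changes above describe, can be sketched in plain Java (this is a standalone illustration of the `java.math` API, not play-json's internal code):

```java
import java.math.BigDecimal;
import java.math.MathContext;

public class BoundedPrecision {
    public static void main(String[] args) {
        // Build a 500-digit literal such as an attacker might send.
        StringBuilder sb = new StringBuilder();
        for (int i = 0; i < 500; i++) sb.append('7');

        // Unbounded parsing keeps all 500 significant digits.
        BigDecimal unbounded = new BigDecimal(sb.toString());
        // DECIMAL128 rounds to at most 34 significant digits, so any
        // downstream arithmetic on the value stays cheap.
        BigDecimal bounded = new BigDecimal(sb.toString(), MathContext.DECIMAL128);

        System.out.println("unbounded precision: " + unbounded.precision());
        System.out.println("bounded precision: " + bounded.precision());
    }
}
```

The trade-off is loss of precision beyond 34 significant digits, which is why #191 made the limit configurable.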
Play JSON Version (2.5.x / etc)
2.7.0-M1
API (Scala / Java / Neither / Both)
Scala 2.12.7
Operating System (Ubuntu 15.10 / MacOS 10.10 / Windows 10)
Ubuntu 16.04
JDK (Oracle 1.8.0_72, OpenJDK 1.8.x, Azul Zing)
Oracle JDK 11
Library Dependencies
none
Expected Behavior
Sub-linear decrease of throughput as the length of the JSON object increases
Actual Behavior
Sub-quadratic decrease of throughput as the length of the JSON object increases
On contemporary CPUs, parsing such a JSON object with an additional field that has 1,000,000 decimal digits (~1 MB) can take more than 13 seconds:
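A payload of that shape can be generated in a few lines (the field names here are arbitrary placeholders, not taken from the original benchmark):

```java
public class PayloadSketch {
    public static void main(String[] args) {
        // A JSON object with one expected field plus an unexpected
        // extra field carrying a 1,000,000-digit number (~1 MB).
        StringBuilder sb = new StringBuilder("{\"expected\":1,\"extra\":");
        for (int i = 0; i < 1_000_000; i++) sb.append('9');
        sb.append('}');
        System.out.println("payload size in bytes: " + sb.length());
    }
}
```

Generating the string is near-instant; the asymmetry between how cheap the attacker's payload is to produce and how expensive it is to parse is what makes this a denial-of-service vector.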
Reproducible Test Case
To run these benchmarks on your JDK, install sbt and/or ensure that it is already installed properly, then use the jsoniter-scala repo: