encoding/json: provide a way to limit recursion depth #31789
Comments
Sounds like a reasonable json.Decoder option. We should also set the default to something reasonable.
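For illustration, one hypothetical shape for such an option. json.Decoder has no SetMaxDepth method; the name and semantics here are assumptions, so the call is shown commented out:

```go
package main

import (
	"encoding/json"
	"fmt"
	"strings"
)

func main() {
	dec := json.NewDecoder(strings.NewReader(`{"a":{"b":{"c":1}}}`))
	// Hypothetical option, not part of the current encoding/json API:
	// dec.SetMaxDepth(100) // would make decoding fail past 100 nested levels

	var v interface{}
	if err := dec.Decode(&v); err != nil {
		// a depth violation would presumably surface here as a decode error
		fmt.Println(err)
		return
	}
	fmt.Println(v)
}
```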
I agree that this is a good idea; encoding/gob has similar checks to avoid runtime panics.
Is anyone working on this? If not, I'd like to take this up.
I believe @joshblakeley intends to work on it soon. There's no rush, as we're in the middle of a freeze right now.
Change https://golang.org/cl/190658 mentions this issue:
I have at least two concerns with this:
Thanks for the comments!
I like your idea. Besides that, I see #33036 still has tags.
Are you suggesting we just not change the current behavior?
For JSON, bounding the input size is a relatively straightforward way to limit heap and CPU consumption (a local benchmark shows 1MB of JSON triggering ~40MB of allocs and taking ~0.05 seconds, and 10MB of JSON triggering ~500MB of allocs and taking ~0.5 seconds). The default stack gets exhausted much more quickly than memory or CPU on a robust server, and the failure mode is much more severe.
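A sketch of the input-size bounding approach described above, using the standard library's http.MaxBytesReader (the 1MB cap is an arbitrary example value, not a recommendation):

```go
package main

import (
	"encoding/json"
	"net/http"
)

func handler(w http.ResponseWriter, r *http.Request) {
	// Cap the request body at 1MB; reads past the limit fail, which
	// also bounds the allocations made while decoding.
	r.Body = http.MaxBytesReader(w, r.Body, 1<<20)

	var payload map[string]interface{}
	if err := json.NewDecoder(r.Body).Decode(&payload); err != nil {
		http.Error(w, "request body too large or invalid", http.StatusBadRequest)
		return
	}
	w.WriteHeader(http.StatusOK)
}

func main() {
	http.HandleFunc("/api", handler)
	http.ListenAndServe(":8080", nil)
}
```

Bounding bytes bounds allocations and CPU, but as the comment above notes, it does not cheaply bound stack depth, since stack is exhausted far sooner than memory.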
I agree. Rather than an API change, I think a generous backstop should be added internally (e.g. a max depth of 10,000). As noted above, this is permitted by the RFC ("...An implementation may set limits on the maximum depth of nesting.") That still allows callers to limit input size to bound allocs/cpu without needing to introspect the JSON for stack overflow attacks.
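A minimal sketch of what such an internal backstop could look like. This is illustrative only, not the real encoding/json scanner; the constant name, error text, and method shapes are assumptions:

```go
package scanner // illustrative sketch, not the actual encoding/json code

import "errors"

// Assumed name for the generous internal limit discussed above.
const maxNestingDepth = 10000

type scanner struct {
	depth int // current nesting level of open '[' and '{'
}

// pushParseState would be called when the scanner enters an array or object.
func (s *scanner) pushParseState() error {
	s.depth++
	if s.depth > maxNestingDepth {
		return errors.New("exceeded max depth")
	}
	return nil
}

// popParseState would be called when the scanner leaves an array or object.
func (s *scanner) popParseState() {
	s.depth--
}
```

The appeal of a backstop this deep is that legitimate JSON essentially never nests 10,000 levels, so no existing callers break, while maliciously nested input fails with an error instead of a stack overflow.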
Change https://golang.org/cl/199837 mentions this issue:
Opened https://go-review.googlesource.com/c/go/+/199837 implementing this as an alternative |
@liggitt's patch looks good to me, so I'm going to merge it when the tree reopens for 1.15. If anyone has reservations about the design or wants to give a review, you have about six weeks to do that :)
What version of Go are you using (go version)?

What did you do?
A JSON string of about 17MB, constructed by the linked code snippet, is enough to cause a stack overflow with the default 1GB stack limit. A different construction can produce this behavior using only about a 5MB string.
https://play.golang.org/p/MLn5clWEIOl
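The linked playground snippet is not reproduced in this thread; the following is a sketch of the general shape of such an input (deeply nested arrays), not necessarily the exact linked code:

```go
package main

import (
	"encoding/json"
	"fmt"
	"strings"
)

func main() {
	// Deeply nested arrays, "[[[...]]]". Decoding into an interface{}
	// recurses once per nesting level, so stack use grows with depth.
	const depth = 8500000 // ~17MB of input, in line with the report above
	input := strings.Repeat("[", depth) + strings.Repeat("]", depth)

	var v interface{}
	err := json.Unmarshal([]byte(input), &v)
	// Before a depth limit existed, this crashed with a stack overflow;
	// with a backstop in place it should instead return a depth error.
	fmt.Println(err)
}
```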
What did you expect to see?
Applications that use encoding/json to parse API calls can limit their maximum request size to avoid this resource exhaustion attack, but that limit may be too small for some applications, and the amount of stack used can still be a problem even if the application does not crash. It would be desirable if the package had some way of enforcing a maximum array or object nesting depth during parsing, so that applications using encoding/json can continue to accept large API calls.
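In the absence of such an option, one workaround is a pre-pass over the input using the existing json.Decoder.Token API, which iterates rather than recurses. A sketch, where maxDepth and checkDepth are illustrative names chosen here:

```go
package main

import (
	"encoding/json"
	"errors"
	"fmt"
	"io"
	"strings"
)

// maxDepth is an illustrative, application-chosen limit.
const maxDepth = 128

// checkDepth streams the input's JSON tokens and reports an error if
// array/object nesting exceeds maxDepth.
func checkDepth(input string) error {
	dec := json.NewDecoder(strings.NewReader(input))
	depth := 0
	for {
		tok, err := dec.Token()
		if err == io.EOF {
			return nil
		}
		if err != nil {
			return err
		}
		switch tok {
		case json.Delim('['), json.Delim('{'):
			depth++
			if depth > maxDepth {
				return errors.New("json: input nests too deeply")
			}
		case json.Delim(']'), json.Delim('}'):
			depth--
		}
	}
}

func main() {
	fmt.Println(checkDepth(`{"a":[1,2,{"b":3}]}`)) // <nil>
	fmt.Println(checkDepth(strings.Repeat("[", 200) + strings.Repeat("]", 200)))
}
```

In practice the input would need to be buffered so it can be decoded again after the check, which costs an extra pass over the data.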
I'm mainly familiar with C++ implementations that do this, for example rippled:
https://github.com/ripple/rippled/blob/5214b3c1b09749420ed0682a2e49fbbd759741e5/src/ripple/protocol/impl/STParsedJSON.cpp#L730
Or BitShares:
https://github.com/bitshares/bitshares-fc/blob/8ebd99b786623bc8d55e89d42df82644a71a6885/src/io/json.cpp#L415
This C++ library for JSON does not limit depth explicitly (although it notes that the RFC allows limiting the nesting depth), but it does provide a callback by which the depth could be checked and parsing aborted:
https://github.com/nlohmann/json/blob/ee4028b8e4cf6a85002a296a461534e2380f0d57/include/nlohmann/json.hpp#L1078
This Rust implementation does limit the nesting depth, but has a flag to disable the check:
https://github.com/serde-rs/json/blob/5b5f95831d9e0d769367b30b76a686339bffd209/src/de.rs#L202
IBM's DataPower Gateways documents nesting limits in their implementation:
https://www.ibm.com/support/knowledgecenter/en/SS9H2Y_7.6.0/com.ibm.dp.doc/json_parserlimits.html