Add configurable processing limits for JSON generator (StreamWriteConstraints)
#1048
Labels:
- 2.16: Issue planned (at earliest) for 2.16
- processing-limits: Issues related to limiting aspects of input/output that can be processed without exception
Similar to #637, but for the generation side: we may want to limit the amount of processing based on some criteria.
Compared to the input side, constraining generation may be less important for DoS reasons, but there are some aspects that seem like they would benefit from having limits.
First: limiting maximum nesting depth (defaulting to, say, 1000 levels). While this may not be an easy DoS attack vector, it is an accidental "own goal" case, where a (relatively) common attempt to serialize cyclic data structures may result in StackOverflowError. While there are possible approaches to preventing this using other mechanisms, capping maximum nesting would be a straightforward and efficient way to avoid SOE and the resulting major resource drainage: instead of having to maintain a partial object graph to look for "back links", we simply keep track of the nesting level. This can be configured to a value that is high enough not to block typical legitimate usage, but prevents recursion by serializers well before SOE. The nesting level is tracked by JsonWriteContext anyway, so this could be relatively simple (implemented via "Add StreamWriteConstraints with a nesting depth check" #1055).

Other possible later additions could include more such limits, but these are just speculative ones, not requested at this point.
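The depth-capping idea above can be sketched as follows. This is a hypothetical illustration of the mechanism, not the actual jackson-core implementation; the class and method names (`DepthLimitedWriter`, `startContainer()`, etc.) are made up for this sketch. The point is that a single counter, bumped on each start-object/start-array call, replaces any object-graph cycle detection:

```java
// Illustrative sketch only: a nesting counter that fails fast once the
// configured cap is exceeded, long before a StackOverflowError could occur.
class DepthLimitException extends RuntimeException {
    DepthLimitException(String msg) { super(msg); }
}

class DepthLimitedWriter {
    private final int maxDepth; // e.g. 1000 by default, per the proposal
    private int depth;

    DepthLimitedWriter(int maxDepth) { this.maxDepth = maxDepth; }

    // Would be called from writeStartObject() / writeStartArray()
    void startContainer() {
        if (++depth > maxDepth) {
            throw new DepthLimitException("Document nesting depth (" + depth
                    + ") exceeds the maximum allowed (" + maxDepth + ")");
        }
    }

    // Would be called from writeEndObject() / writeEndArray()
    void endContainer() { --depth; }

    int currentDepth() { return depth; }
}
```

A serializer recursing into a cyclic structure would keep calling `startContainer()` and hit the cap deterministically, with a clear exception instead of an SOE.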
As to possible implementation: this should follow the pattern established with #637, adding StreamWriteConstraints and starting with this first limit. We should probably allow something similar to #1019 immediately as well (wrt static default override).
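Following the #637 pattern, the constraints object would plausibly be an immutable value with a builder, plus a #1019-style static default override. The sketch below is a guess at that shape; the names (`WriteConstraints`, `overrideDefault()`, etc.) are illustrative, not the final jackson-core API:

```java
// Hypothetical sketch of a write-side constraints object, mirroring the
// read-side StreamReadConstraints pattern from #637/#1019. Names invented.
final class WriteConstraints {
    static final int DEFAULT_MAX_DEPTH = 1000;
    private static WriteConstraints DEFAULT = new WriteConstraints(DEFAULT_MAX_DEPTH);

    private final int maxNestingDepth;

    private WriteConstraints(int maxNestingDepth) {
        this.maxNestingDepth = maxNestingDepth;
    }

    static Builder builder() { return new Builder(); }

    // Static default override, analogous to what #1019 proposed for reads:
    // lets applications change limits globally without touching every factory.
    static void overrideDefault(WriteConstraints c) { DEFAULT = c; }
    static WriteConstraints defaults() { return DEFAULT; }

    int getMaxNestingDepth() { return maxNestingDepth; }

    static final class Builder {
        private int maxNestingDepth = DEFAULT_MAX_DEPTH;

        Builder maxNestingDepth(int d) { this.maxNestingDepth = d; return this; }
        WriteConstraints build() { return new WriteConstraints(maxNestingDepth); }
    }
}
```

The builder keeps the object immutable and forward-compatible (new limits become new builder methods), while the static override gives a one-line global escape hatch for applications that cannot reconfigure every factory instance.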