
Sparse Arrays #15

Open
rodneyrehm opened this issue Feb 26, 2014 · 4 comments

@rodneyrehm

<input name="mix[0]" value="alpha">
<input name="mix[5]" value="bravo">
<!--
  { mix: ["alpha", null, null, null, null, "bravo"] }
  mix[0]=alpha&mix[5]=bravo
-->

This has been pointed out before: not only is this "ugly", it's also a trivial DoS waiting to happen.
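
For illustration (the field name and index here are made up), a single crafted control is enough to make a naive conversion algorithm allocate an enormous array:

<input name="mix[999999999]" value="boom">
<!--
  a naive conversion would allocate an array with a billion slots,
  all but one of them null
-->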

While this behavior was meant to be developer-friendly, a simple "fix" can be borrowed from PHP's json_encode, which distinguishes sequential from non-sequential arrays. If the keys of a map do not fulfill the following conditions, the map is not converted to an array but serialized as an object:

var data = {"0": "alpha", "1": "bravo", "2": "charlie"};

// all keys must be integers (digits only)
var _notInteger = /[^0-9]/;
var _invalidKeys = Object.keys(data).some(function (key) {
  return _notInteger.test(key);
});
var keys = Object.keys(data).map(Number);
// lowest index must be 0
var _lowerBound = Math.min.apply(Math, keys) !== 0;
// highest index must be exactly keys.length - 1 (i.e. no gaps)
var _upperBound = Math.max.apply(Math, keys) !== keys.length - 1;

if (_invalidKeys || _lowerBound || _upperBound) {
  // serialize to Object
} else {
  // serialize to Array
}
@darobin
Owner

darobin commented Mar 3, 2014

The problem is that there are legitimate reasons for sending (moderately) sparse arrays. I really don't want to remove those.

I would think it reasonable for UAs to impose a limit on the number of null fields in an array. WDYT?

@macek

macek commented Jul 14, 2014

@darobin I'm going to agree with your latest comment. It's certainly worth some discussion, but the fact that (moderately) sparse arrays could be misused is not, by itself, a sufficient reason to deny support for them.

@emilv

emilv commented Nov 27, 2014

This is not so much a UA issue as it is an issue with the server-side conversion algorithm for supporting older clients. Someone can send in a relatively small payload that blows up in server-side memory.

Enforcing a limit on null values might be surprising to a lot of developers.

We should specify this limit, either as a concrete upper bound or at least by acknowledging that limits may exist and defining what should happen when one is hit. Otherwise we will quickly get a lot of different behaviours, which kind of defeats the point of a standard.

I propose that we address this issue by:

  1. Mentioning that there might be such a limit, and
  2. Saying how to enforce this limit in the conversion algorithm
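
As a minimal sketch of what (2) could look like, assuming a hypothetical MAX_SPARSE_GAP limit (the name, the value, and the failure mode below are made up for illustration, not part of any spec text):

var MAX_SPARSE_GAP = 1000; // hypothetical limit, not taken from the spec

// Conversion step that sets array[index] = value, but refuses to create
// more than MAX_SPARSE_GAP missing entries in one go.
function setIndex(array, index, value) {
  var gap = index - array.length;
  if (gap > MAX_SPARSE_GAP) {
    // the spec would have to define the failure mode: throw, drop the
    // entry, or fall back to serializing the whole thing as an object
    throw new RangeError("sparse gap of " + gap + " exceeds limit");
  }
  array[index] = value;
  return array;
}

setIndex(["alpha"], 5, "bravo");   // => ["alpha", <4 empty>, "bravo"]
setIndex([], 999999999, "boom");   // => RangeError

Whatever the exact numbers end up being, putting the check into the conversion algorithm itself keeps implementations from inventing divergent limits on their own.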

@Boldewyn
Contributor

Apparently, browsers do not yet (as of March 2016) have any safeguard against massive sparse arrays: http://codingsight.com/how-to-hang-chrome-firefox-and-node-js-inside-native-func/
