Which parsing method do you use, the DOM parser or one of the range-based ones? For the former I'd somewhat expect this, depending on the structure of the JSON document, because arrays and associative arrays have a huge per-item overhead if only a few items are stored in them.
Arrays could be optimized in that regard by using a pointer-bump allocation strategy, but for AAs there is not much that can be done, other than not using them.
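To make the per-object overhead concrete, here is a small sketch in Python (as an analogy; the actual library in question is a D one, where AAs play the role of Python dicts). It compares the in-memory footprint of a parsed object against the length of its JSON text; the specific sizes are CPython implementation details, but the ratio illustrates why millions of small objects blow up memory.

```python
import sys

# A small JSON object parsed into a hash map costs far more memory than
# its text form suggests: the container itself, its hash-table slots,
# and the boxed keys and values all add up.
obj = {"id": 1, "value": 2.5}

container = sys.getsizeof(obj)                      # the dict's own footprint
members = sum(sys.getsizeof(k) + sys.getsizeof(v)   # boxed keys and values
              for k, v in obj.items())

text_len = len('{"id": 1, "value": 2.5}')           # 24 bytes of JSON text

print(container + members, "bytes in memory vs", text_len, "bytes of text")
```

Multiply that per-object blow-up by tens of millions of objects and a 1 GB file ballooning past 15 GB is unsurprising.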
It is with the DOM parser; to be fair, though, a lot of JSON libraries I've tried have had excessive memory usage with these large files. This one has something like 50 million objects in it, so high usage is understandable when you take into account that all the keys and values have to be stored as well.
Large JSON files (1+ GB) use up 15+ GB of memory and take 4+ minutes of parsing to reach the point where it crashes with out of memory for me.
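A range-based (streaming) approach avoids this by never materializing the whole DOM. As a rough sketch of the idea in Python (not the library's actual API), the stdlib's `json.JSONDecoder.raw_decode` can pull one top-level value at a time from a buffer, so only the current record is ever live:

```python
import json

def iter_json_values(text):
    """Yield consecutive top-level JSON values from a string, one at a
    time, instead of building one giant in-memory document."""
    decoder = json.JSONDecoder()
    idx, n = 0, len(text)
    while idx < n:
        # Skip whitespace between values.
        while idx < n and text[idx] in " \t\r\n":
            idx += 1
        if idx >= n:
            break
        obj, idx = decoder.raw_decode(text, idx)
        yield obj

# Example: three records processed one by one.
stream = '{"id": 1} {"id": 2} {"id": 3}'
ids = [rec["id"] for rec in iter_json_values(stream)]
print(ids)  # → [1, 2, 3]
```

With this shape of processing, peak memory is bounded by the largest single record rather than the whole file, at the cost of not being able to revisit earlier objects.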