Speed and Memory Usage #42

Open
ghost opened this issue Mar 16, 2019 · 3 comments

Comments


ghost commented Mar 16, 2019

Large JSON files (1+ GB) use 15+ GB of memory and take 4+ minutes to parse before the process crashes with an out-of-memory error.


ghost added the enhancement label Mar 17, 2019
s-ludwig (Collaborator) commented

Which method of parsing do you use? The DOM parser or one of the range-based ones? For the former, I'd kind of expect this to happen, depending on the structure of the JSON document, because arrays and associative arrays have a huge per-instance overhead if only a few items are stored in them.

Arrays could be optimized in that regard by using a pointer-bump allocation strategy, but for AAs there is not much that can be done, other than not using them.
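The per-container overhead described above can be made concrete. The library in this thread is written in D, so the following is only an illustrative Python sketch: it measures the shallow size of a small dict (the container a DOM node typically maps to) and adds the sizes of its keys and values, showing how the bookkeeping can dwarf the actual data for small objects. The record contents are made up.

```python
import sys

# A tiny record, stand-in for one of the millions of objects
# in a large JSON document.
record = {"id": 1, "name": "a", "value": 2.5}

# Shallow size of the dict container alone, before counting the
# keys and values it holds -- the overhead a DOM parser pays for
# every object node in the document.
container = sys.getsizeof(record)

# Rough total per record: container plus each key and value object.
total = container + sum(sys.getsizeof(k) + sys.getsizeof(v)
                        for k, v in record.items())

print(container, total)
```

For a three-field record, the container and string keys together usually cost several times more memory than the payload values, which is why documents made of many small objects blow up so badly in a DOM representation.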

ghost (Author) commented Mar 17, 2019

It is with the DOM parser. To be fair, though, a lot of JSON libraries I've tried have had excessive memory usage with these large files. The file has something like 50 million objects in it, so the usage is understandable when you take into account that all the keys and values have to be stored along with them.
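A back-of-envelope check shows the reported numbers are plausible. The per-object cost below is an assumed illustrative figure, not a measurement of this library:

```python
# Hypothetical estimate: if each of the ~50 million objects costs a
# few hundred bytes of container overhead on top of its keys and
# values, the total DOM footprint lands near the 15 GB reported.
objects = 50_000_000
bytes_per_object = 300          # assumed average per-node overhead
total_gib = objects * bytes_per_object / 2**30

print(f"{total_gib:.1f} GiB")   # roughly 14 GiB
```

Under that assumption, the 1 GB file expanding to ~15 GB in memory is exactly what a DOM representation would predict.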
