Fastest way to create a huge object with NAPI #1219
Comments
Hi @ggreco,
@ggreco, do you need any other information, or is it possible to close the issue?
Not really. My point is to be able to build a huge JavaScript object inside the native code with decent performance. At the moment it takes several seconds to create one in the worst-case scenario (an 800 MB JSON file), as my example code shows. I hoped NAPI included some sort of "batch" creator that could build a full structure without having to create the nodes one by one. The JSON parsing was just an example: I have a huge dataset inside my native addon (so streaming is not the issue), and I have to pass pieces of it to the JavaScript world. It seems that NAPI is so slow at creating objects that passing a buffer containing a binary "packed" object to JavaScript and parsing it within Node, using code similar to this:

```js
buffer.readInt16BE(offset);
offset += 2;
buffer.readInt8(offset);
offset += 1;
buffer.readDoubleBE(offset);
offset += 8;
// ...
```

is faster than creating the data structure from inside the native module! Anyway, given the original problem, the fastest JSON streamer I could find in the npm registry is 10 times slower than:

```js
JSON.parse(fs.readFileSync("my.json"));
```
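For context on the packed-buffer workaround described above, here is a minimal sketch of what the native side might look like, assuming node-addon-api; the record layout and names are illustrative, not taken from the thread:

```cpp
#include <napi.h>
#include <cstdint>
#include <cstring>
#include <vector>

// Illustrative only: pack records as [int16 id][int8 flag][double value] into
// a single Node Buffer, so V8 sees one allocation instead of one object per record.
Napi::Value PackRecords(const Napi::CallbackInfo& info) {
  Napi::Env env = info.Env();

  struct Record { int16_t id; int8_t flag; double value; };   // hypothetical dataset
  std::vector<Record> records = {{1, 0, 3.14}, {2, 1, 2.71}};

  const size_t stride = 2 + 1 + 8;                            // bytes per packed record
  std::vector<uint8_t> bytes(records.size() * stride);
  size_t off = 0;
  for (const auto& r : records) {
    bytes[off]     = static_cast<uint8_t>((r.id >> 8) & 0xff); // big-endian int16,
    bytes[off + 1] = static_cast<uint8_t>(r.id & 0xff);        // matches readInt16BE
    bytes[off + 2] = static_cast<uint8_t>(r.flag);             // matches readInt8
    std::memcpy(&bytes[off + 3], &r.value, sizeof(double));    // host-endian double; pair
                                                                // with readDoubleLE on
                                                                // little-endian hosts
    off += stride;
  }
  // One allocation visible to V8, regardless of how many records are packed.
  return Napi::Buffer<uint8_t>::Copy(env, bytes.data(), bytes.size());
}
```

On the JavaScript side this would pair with a decoding loop like the `readInt16BE`/`readInt8`/`readDouble*` snippet quoted above.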
Essentially that is what I'm doing with the Node Buffer... My hope was that I was missing some NAPI utility for batch creation, but I also reviewed the C API and there is nothing for my use case... so you can probably close it, and I can file a feature request against the Node C API. If you know in advance, for instance, that you need to create an array of 1,000,000 objects, it would be nice to have an API to create them in one go, instead of having to do (pseudo-code in C++, not 100% sure about the syntax):

```cpp
auto a = Napi::Array::New(env, 1000000);
for (uint32_t i = 0; i < 1000000; ++i)
  a.Set(i, Napi::Object::New(env));
```
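As an aside not raised in the thread (so treat it as my assumption about what helps): node-addon-api has no bulk-creation API, but when filling a large array element by element, a per-iteration or per-batch `Napi::HandleScope` at least keeps the number of live handles bounded. It does not remove the per-object allocation cost being discussed. A sketch:

```cpp
#include <napi.h>

// Fills a pre-sized array one element at a time; the short-lived scope releases
// each element's handle once the element is referenced from the array.
Napi::Array BuildLargeArray(Napi::Env env, uint32_t count) {
  Napi::Array a = Napi::Array::New(env, count);
  for (uint32_t i = 0; i < count; ++i) {
    Napi::HandleScope scope(env);        // bounds handle growth per iteration
    a.Set(i, Napi::Object::New(env));    // the object stays alive via the array
  }
  return a;
}
```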
This issue is stale because it has been open many days with no activity. It will be closed soon unless the stale label is removed or a comment is made.
The discussion in the Node-API team meeting was that we should close this issue in deference to the discussion going on in Node core (nodejs/node#45905). Once that completes, we can re-open this issue or create a new one to cover the node-addon-api side.
This is not strictly an issue, more of a performance question; point me to a better location to post this if it's not the right place... I could not find one.

I need to parse huge JSONs in my Node application, and Node has a limit of 512 MB on the string that can be passed to JSON.parse(), so I tried to implement JSON parsing of a file (or a remote buffer) using NAPI. The result is quite slow: more than 10 times slower than JSON.parse() on JSON sizes that JSON.parse() can handle. I'm using rapidjson to parse in C++, and that part is pretty fast; benchmarking the code, the bottleneck seems to be the allocation of the NAPI values I create while parsing the JSON. So I'm asking if there is something I can do to speed up the allocation of a lot of small objects (something like a bulk allocation method, or an SQL-like begin/end transaction logic for operations like this one). Here is the C++ code I wrote:
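The attached C++ file is not reproduced in this extract. As a rough reconstruction of the kind of converter being described, assuming rapidjson plus node-addon-api (names and structure are my guess, not the original code), the recursion below illustrates why every JSON node costs at least one NAPI value allocation:

```cpp
#include <napi.h>
#include <rapidjson/document.h>

// Hypothetical sketch, not the original attachment: recursively mirror a
// rapidjson value tree as Napi values. Each node is one Napi allocation,
// which is the bottleneck described above.
static Napi::Value ToNapi(Napi::Env env, const rapidjson::Value& v) {
  if (v.IsObject()) {
    Napi::Object obj = Napi::Object::New(env);
    for (const auto& m : v.GetObject())
      obj.Set(m.name.GetString(), ToNapi(env, m.value));
    return obj;
  }
  if (v.IsArray()) {
    Napi::Array arr = Napi::Array::New(env, v.Size());
    uint32_t i = 0;
    for (const auto& e : v.GetArray())
      arr.Set(i++, ToNapi(env, e));
    return arr;
  }
  if (v.IsString()) return Napi::String::New(env, v.GetString(), v.GetStringLength());
  if (v.IsBool())   return Napi::Boolean::New(env, v.GetBool());
  if (v.IsNumber()) return Napi::Number::New(env, v.GetDouble());
  return env.Null();
}
```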