v8.writeHeapSnapshot() exceeds V8 string length, so the snapshot cannot be imported in Google Chrome #50952
Comments
If the heap snapshot can be generated by Node.js just fine, the bug is in the Chromium DevTools. Please open an issue in the Chromium issue tracker instead: https://bugs.chromium.org/p/chromium/issues
@joyeecheung fyi, the Chromium team has acknowledged the problem but says Chromium is not built to support this kind of snapshot. So basically this will remain unsolved and is "kind of" a Node.js TODO, still. https://bugs.chromium.org/p/chromium/issues/detail?id=1505989#c10
I don't think there is much to do on the Node.js side; it's also up to the user which tool they want to use to analyze the snapshots. Chromium's DevTools is just one of them. It's no more a Node.js support issue than, e.g., requesting that VS Code display a large snapshot better. AFAIK some companies develop their own proprietary tools to analyze big heap snapshots too. And Node.js isn't the only runtime that allows users to generate a big V8 snapshot, so this could be an issue for Deno as well.
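To illustrate that point: the .heapsnapshot format is plain JSON, so custom tooling can inspect it without ever materializing the whole file as a single string. Below is a minimal sketch (assuming the standard V8 heap snapshot layout; the file name is only an example) that reads just the header to report node and edge counts:

```js
// peek-header.js - sketch only: read the start of a .heapsnapshot file to
// pull node_count/edge_count out of the "snapshot" header object, without
// loading the multi-gigabyte file as one string.
'use strict';
const fs = require('node:fs');

const file = process.argv[2] || 'large.heapsnapshot'; // example name
const fd = fs.openSync(file, 'r');
const buf = Buffer.alloc(64 * 1024); // the header normally fits well within 64 KB
const bytesRead = fs.readSync(fd, buf, 0, buf.length, 0);
fs.closeSync(fd);

const head = buf.toString('utf8', 0, bytesRead);
const nodeCount = head.match(/"node_count":\s*(\d+)/);
const edgeCount = head.match(/"edge_count":\s*(\d+)/);
console.log('node_count:', nodeCount ? nodeCount[1] : 'not found in header');
console.log('edge_count:', edgeCount ? edgeCount[1] : 'not found in header');
```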
I do not want to be an asshole here, since I have neither the skills nor the time to implement this myself, but: taken from https://v8.dev/ itself, Chrome and Node are both mentioned as the primary users of the V8 engine, and neither supports analysing snapshots of (I would say) the size to be expected when you need a snapshot for RAM usage profiling. So you expect me (or any user) to just give up and deal with it? Because the biggest commercial "user" of said engine and the biggest non-commercial "user" of said engine don't even bother implementing it? If so, why don't we just delete all of the code that creates a snapshot anyway? ... Deno does not provide any kind of useful (interactive) snapshot analysis either. ouch!
I am not sure which one is being described as "commercial" here, but just a reminder: Node.js is not owned by a company. It's a community-driven project, even more so than Chromium or V8. I do know some companies that need to analyze big heap snapshots develop their own tools, though they don't seem to open-source them or make them available for free. In one case I know of, they are piggybacking on a Java tool, and it seems that on Java's side one also needs to jump through some hoops to find the right tool to analyze a huge heap - though on Java's side the heap size threshold that makes this difficult is higher, and there is a lot more commercial (non-free) tooling and runtime work that companies build their business on. So to me this seems to be the economics of open source software in play. There might be other free tools out there for this use case that I am not aware of, though. You can try asking in https://github.com/nodejs/diagnostics/issues. The issue tracker of Node.js core is not the right place to air your grievances, because whatever solution there would be, it is unlikely to be developed in this code base.
Version
v18.18.2
Platform
Microsoft Windows NT 10.0.19044.0 x64
Subsystem
No response
What steps will reproduce the bug?
Call v8.writeHeapSnapshot() in a Node.js application using roughly 2.5 GB of RAM (see the sketch after these steps).
When writeHeapSnapshot() starts, it uses all available system memory (in my case up to 22 GB), then generates a heap snapshot file of 4.12 GB.
Then I import this file into the Google Chrome memory profiler (Chrome 119.0.6045.160, 64-bit).
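For context, a minimal reproduction sketch; the allocation loop below is only a stand-in for a real application holding roughly 2.5 GB of live objects, and the file and variable names are examples:

```js
// repro.js - sketch only: the loop stands in for a real ~2.5 GB workload.
// Run with something like: node --max-old-space-size=8192 repro.js
'use strict';
const v8 = require('node:v8');

// Keep references so the objects stay live when the snapshot is taken.
const retained = [];
for (let i = 0; i < 2_500_000; i++) {
  retained.push({ id: i, payload: 'x'.repeat(1000) }); // roughly 1 KB each
}

// Serializing a heap this size spikes memory usage well beyond the heap
// itself and produces a multi-gigabyte .heapsnapshot file on disk.
const file = v8.writeHeapSnapshot('large.heapsnapshot');
console.log('wrote', file, `(${retained.length} retained objects)`);
```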
How often does it reproduce? Is there a required condition?
Reproduces every time (with large heap snapshots).
What is the expected behavior? Why is that the expected behavior?
No response
What do you see instead?
Additional information
related to: #35973 (comment)
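A rough way to check whether a given snapshot even fits into a single JavaScript string (which is the limit a tool hits if it reads the whole file at once) is to compare its size to buffer.constants.MAX_STRING_LENGTH. A minimal sketch follows, assuming a snapshot file named large.heapsnapshot; the comparison is approximate, since MAX_STRING_LENGTH is measured in UTF-16 code units rather than bytes:

```js
// check-size.js - sketch: compare a snapshot's byte size against V8's
// maximum string length as exposed by Node. Heap snapshots are mostly
// ASCII JSON, so bytes roughly correspond to UTF-16 code units here.
'use strict';
const fs = require('node:fs');
const { constants } = require('node:buffer');

const file = process.argv[2] || 'large.heapsnapshot'; // example name
const { size } = fs.statSync(file);

console.log('snapshot size     :', size, 'bytes');
console.log('MAX_STRING_LENGTH :', constants.MAX_STRING_LENGTH, 'code units');
console.log('fits in one string:', size <= constants.MAX_STRING_LENGTH);
```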