v8 heapTotal keeps rising when heapUsed does not #1484
Comments
According to this, the difference between …
Possibly related to nodejs/node#22229
That would typically lead to RSS rising, but not `heapTotal`.
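Since the question is where `heapTotal` itself is growing, here is a minimal sketch (not from the original thread) that breaks the V8 heap down per space using the built-in `v8` module; the one-minute interval is an arbitrary choice:

```js
// Sketch: break heapTotal down per V8 heap space (old_space, code_space, ...)
// to see which space accounts for the growth.
const v8 = require('v8');

setInterval(() => {
  const { total_heap_size, used_heap_size } = v8.getHeapStatistics();
  console.log(`heapTotal=${total_heap_size} heapUsed=${used_heap_size}`);

  for (const s of v8.getHeapSpaceStatistics()) {
    console.log(`  ${s.space_name}: size=${s.space_size} used=${s.space_used_size}`);
  }
}, 60 * 1000); // interval is arbitrary
```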
@Hakerh400 Thanks for the insight, it seems that the fix for that memory leak landed in … However, I did actually test the app before with Node … as well.

@addaleax Thanks for the insight as well, good point. Any idea what would be causing …
@eladnava I think that would be visible in a heap snapshot, if you can generate + inspect one?
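As a sketch of that suggestion, assuming a Node version that ships `v8.writeHeapSnapshot()` (11.13 or newer; older setups typically used the third-party `heapdump` package), a snapshot can be written on demand and then loaded into the Chrome DevTools Memory tab:

```js
// Minimal sketch: write a heap snapshot when the process receives SIGUSR2.
// The signal is an arbitrary trigger; an HTTP endpoint or timer works too.
// Note: taking a snapshot needs roughly as much memory as the heap itself.
const v8 = require('v8');

process.on('SIGUSR2', () => {
  const file = v8.writeHeapSnapshot(); // returns the generated file name
  console.log(`Heap snapshot written to ${file}`);
});
```

Comparing two snapshots taken a few minutes apart under steady load usually makes growing retainers much easier to spot than inspecting a single snapshot.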
@Hakerh400 I removed all invocations of `global.gc()`. Seems like …

@addaleax I'll generate a heap snapshot soon.
@addaleax Here is a heap snapshot for the following: …

This was taken after removing all traffic from the server (right after generating the graph in my last post), so that no clients were connected, but the leak persists: some allocation is lingering in the …
It seems the symptom is now …
It seems that the …

I am not using any native C libraries or dependencies. Therefore I am completely at a loss as to what is occupying memory in … Any help will be greatly appreciated.
I'm also considering the option that there is no longer a leak, but that the OS simply prefers not to reclaim the memory.
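One way to narrow that question down (how much of `rss` is the V8 heap versus everything else) is to log the non-heap portion over time. A hedged sketch using only `process.memoryUsage()`; the `arrayBuffers` field exists only on newer Node versions, hence the guard, and the interval is arbitrary:

```js
// Sketch: track how much of RSS lives outside the V8 heap. A growing gap with
// flat `external`/`arrayBuffers` points at native allocations, or at pages the
// allocator/OS has simply not returned yet.
const toMB = (n) => (n / 1048576).toFixed(1);

setInterval(() => {
  const m = process.memoryUsage();
  const line =
    `rss=${toMB(m.rss)}MB heapTotal=${toMB(m.heapTotal)}MB ` +
    `external=${toMB(m.external)}MB nonHeap=${toMB(m.rss - m.heapTotal)}MB` +
    (m.arrayBuffers === undefined ? '' : ` arrayBuffers=${toMB(m.arrayBuffers)}MB`);
  console.log(line);
}, 5 * 60 * 1000);
```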
@eladnava You should consider that RSS is not actually an intrinsic measure of your code; it also competes with whatever else is in your workload at the time, that is, all other processing, including the OS. If you have no other significant workload, then your Node.js process will be the most active foreground job running, it will win this competition for physical memory, and RSS will rise. If there is some other activity, that competition will cause pages of memory (usually the least-used ones) to be paged out in favour of the other workload. This is perfectly normal. Indeed, with a mixed workload, all things being equal, RSS can still vary! That is what RSS (Resident Set Size) is all about: it is the portion (set) of your process's allocated memory that is currently resident in physical memory, based on current activity relative to the current workload.
@shellberg Thanks for this insight! That definitely appears to be the case.

It seems that I have managed to fix the issue! The thing is, I don't know exactly what caused it, as I changed 10 things at once, but it looks like memory consumption is stable now. At last!

Thank you all @shellberg @addaleax @Hakerh400 for your help! It is greatly appreciated. 😄
@eladnava …
@vaibhavi3t Unfortunately I don't remember as it was too long ago, but all I can recommend is trying to tinker with the settings mentioned in this issue to see if it has any effect.
@eladnava …
Original issue description:

When running my app with `--max-old-space-size=16384`, the `heapTotal` keeps rising although the `heapUsed` remains pretty much the same. `global.gc()` is run every 3 minutes precisely. The machine itself has exactly 16GB of RAM. Eventually this behavior leads to an OOM and `node` is killed.

Now, originally I thought this was a memory leak, but since `heapUsed` remains constant it can't be, can it? Is it possibly a native memory leak? Or just the `heapTotal` growing uncontrollably due to some bug in Node core?

Interestingly as well, the gap between `rss` and `heapTotal` keeps growing over time. Is that due to some native memory leak outside the `v8` heap?

Any pointers as to what's going on would be super helpful. Unfortunately I cannot provide code to reproduce currently, but I can say that the load on the server and the number of concurrent connections remain the same throughout the test.

I'm also considering testing with a lower `--max-old-space-size` to see if it forces `node` to keep the `heapTotal` at a certain size.

Thanks!
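For context, a minimal sketch of the kind of setup described above; the actual application code was not shared in this issue, and the call to `global.gc()` assumes the process was started with `--expose-gc` (e.g. `node --expose-gc --max-old-space-size=16384 app.js`):

```js
// Hypothetical reconstruction of the reported setup, not the author's code:
// force a full GC and log heap metrics every 3 minutes.
const toMB = (n) => Math.round(n / 1048576);

setInterval(() => {
  if (typeof global.gc === 'function') {
    global.gc(); // synchronous full collection; only defined with --expose-gc
  }
  const { rss, heapTotal, heapUsed } = process.memoryUsage();
  console.log(`rss=${toMB(rss)}MB heapTotal=${toMB(heapTotal)}MB heapUsed=${toMB(heapUsed)}MB`);
}, 3 * 60 * 1000);
```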