Memory leak when using asynch #501

Closed
oferh opened this issue Oct 26, 2015 · 5 comments

@oferh

oferh commented Oct 26, 2015

I've created a minimal asynchronous addon based on async_pi_estimate and ran it in a loop. There appears to be a memory leak: the addon code itself doesn't allocate any dynamic memory, yet after running 200K iterations the memory footprint is as follows:
PID USER PR NI VIRT RES SHR S %CPU %MEM TIME+ COMMAND
29684 ofer 20 0 813.3m 161.8m 7.1m S 0.0 2.0 0:03.29 node

The code is available in this gist.

Node version: 0.12.7
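
For reference, a rough sketch of the kind of driver loop described above (the actual code is in the linked gist; the 200K iteration count comes from the description, and the require path and asynctest name are taken from the later comment, so treat this as a reconstruction, not the gist itself):

var addon = require('../');

// Queue 200K async calls and report when the last callback has run.
var remaining = 200000;
for (var i = 0; i < 200000; i++) {
  addon.asynctest(function () {
    if (--remaining === 0) {
      console.log('all callbacks completed');
    }
  });
}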

@kkoopa
Collaborator

kkoopa commented Oct 26, 2015

Nothing there indicates a leak.


@oferh
Author

oferh commented Oct 26, 2015

It looks as if there is no reuse of resources; it's strange that RES reaches 150M for this, but that might be my misunderstanding of how Node/V8 allocates memory.

I changed the test to run two rounds of 2K calls each, and memory consumption increases from one round to the next.

var addon = require('../');

// First round of 2K async calls.
for (var i = 0; i < 2000; i++) {
  addon.asynctest(function () {});
}
global.gc(); // requires running node with --expose-gc

// Second round of 2K async calls.
for (var i = 0; i < 2000; i++) {
  addon.asynctest(function () {});
}
global.gc();

This is the valgrind leak summary after the first round:

==28951== LEAK SUMMARY:
==28951==    definitely lost: 288 bytes in 1 blocks
==28951==    indirectly lost: 0 bytes in 0 blocks
==28951==      possibly lost: 7,241 bytes in 51 blocks
==28951==    still reachable: 1,084,994 bytes in 222 blocks
==28951==         suppressed: 0 bytes in 0 blocks

And this is the leak summary after the second round:

==28966== LEAK SUMMARY:
==28966==    definitely lost: 288 bytes in 1 blocks
==28966==    indirectly lost: 0 bytes in 0 blocks
==28966==      possibly lost: 7,241 bytes in 51 blocks
==28966==    still reachable: 1,249,474 bytes in 238 blocks
==28966==         suppressed: 0 bytes in 0 blocks

@kkoopa
Collaborator

kkoopa commented Oct 26, 2015

You have not mentioned which version of Node this is, but I'll assume it's 4. It does not really matter, though, since I would not expect RSS to decrease here; see nodejs/node#2631 (comment):

small chunks are allocated using sbrk and are not returned to the system (rss does not decrease)

You can also run node with --trace-gc to see when garbage collection happens. For reference, last time we had an actual leak, the system ran out of memory in 15 to 30 minutes.
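
To separate V8 heap growth from RSS growth, something along these lines can be added to the test above. This is a minimal sketch that assumes the same addon.asynctest export as in the earlier comment and is run with node --expose-gc --trace-gc:

var addon = require('../');

function report(label) {
  // process.memoryUsage() reports rss and V8 heap usage in bytes.
  var m = process.memoryUsage();
  console.log(label,
    'rss:', Math.round(m.rss / 1048576) + 'M',
    'heapUsed:', Math.round(m.heapUsed / 1048576) + 'M');
}

for (var i = 0; i < 2000; i++) {
  addon.asynctest(function () {});
}
global.gc(); // requires --expose-gc
report('after round 1');

for (var i = 0; i < 2000; i++) {
  addon.asynctest(function () {});
}
global.gc();
report('after round 2');

If heapUsed stays flat while rss stays high, that is consistent with memory being retained by the allocator rather than leaked.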

@oferh
Author

oferh commented Oct 26, 2015

I tried the same loop as in nodejs/node#2631 and it crashes after ~15 seconds with: FATAL ERROR: CALL_AND_RETRY_LAST Allocation failed - process out of memory

Using node version 0.12.7

@oferh
Author

oferh commented Oct 27, 2015
