console.log + XTerm cause memory leak #4671
Comments
Edit: doh. Apparently this case didn't fail there but does in XTerm.
Most probably this is a dupe of #1741, #2970, #3171, #4280.
Oh, I see, that makes sense. It's just interesting that at some point XTerm (quite abruptly) stopped printing at all; I'd expect it to slow down, but this sudden stop is strange.
@tomaskulich I will try to reproduce that in several hours just to be sure.
This doesn't surprise me; you are blocking the event loop with an infinite JS loop. Wouldn't you expect node's event loop to slowly accumulate undone work if you never return to it?
@sam-github that's perfectly fine. What surprises me is that after ~4000 lines of output I cease to see any new lines being printed. I'd expect the terminal to continue doing its work (at least at some speed). As you say, since it cannot keep pace with node, node will eventually run out of memory, which is understandable. I'm not saying it is a big issue; I was worried there was something much more serious behind this.
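For reference, a minimal sketch of the usual way to avoid this kind of unbounded buffering (not taken from the issue; it is the standard backpressure pattern for Node writable streams): check the return value of process.stdout.write() and wait for the 'drain' event before writing more, so control regularly returns to the event loop.

```js
// Standard backpressure pattern (illustrative, not from the issue): stop
// writing when write() returns false and resume on 'drain', so buffered
// output never grows without bound.
'use strict';

var i = 0;

function writeNumbers() {
  var ok = true;
  // Keep writing while stdout's internal buffer accepts more data.
  while (ok) {
    ok = process.stdout.write(i++ + '\n');
  }
  // The buffer is full; wait for it to drain before continuing. Returning
  // to the event loop here is what lets the queued output reach the terminal.
  process.stdout.once('drain', writeNumbers);
}

writeNumbers();
```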
I'm going to close this as it really sounds like a duplicate of all the other " |
When I run this simple script, I get only around 3k - 5k numbers printed to my console and then node just stops in a very strange state. Running it with node --trace-gc, I noticed there is a lot of GC activity after the numbers stopped (see below). I'm using the newest version of XTerm on Arch (xterm-320-2).
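The script is not shown here; based on the description (an infinite JS loop printing an incrementing counter via console.log, as discussed in the comments), a reproduction would look roughly like this:

```js
// Rough reconstruction of the reproduction (the original script is not
// shown): print an incrementing counter in a tight loop, never returning
// to the event loop.
var i = 0;
while (true) {
  console.log(i++);
}
```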
Things that are not affecting the faulty behavior in any way:
- node version: v0.12.9, v3.3.1, v5.0.0
- using process.stdout.write instead of console.log
What makes the problem disappear:
Output from node --trace-gc myscript.js:
...
...
4447
4448
4449
4450
4451
4452
4453
4454
4455
4456
[4091] 213 ms: Scavenge 9.8 (43.5) -> 6.9 (43.5) MB, 1.7 ms [allocation failure].
[4091] 225 ms: Scavenge 10.4 (43.5) -> 7.6 (44.5) MB, 1.9 ms [allocation failure].
[4091] 245 ms: Scavenge 11.1 (45.5) -> 8.2 (45.5) MB, 3.5 ms [allocation failure].
[4091] 269 ms: Scavenge 15.8 (45.5) -> 9.6 (45.5) MB, 2.9 ms [allocation failure].
[4091] 309 ms: Scavenge 16.5 (45.5) -> 10.8 (46.5) MB, 4.0 ms (+ 14.7 ms in 103 steps since last GC) [allocation failure].
[4091] 317 ms: Mark-sweep 11.5 (46.5) -> 10.0 (47.5) MB, 5.0 ms (+ 16.1 ms in 113 steps since start of marking, biggest step 0.3 ms) [GC interrupt] [GC in old space requested].
[4091] 355 ms: Scavenge 18.2 (47.5) -> 11.6 (47.5) MB, 2.5 ms [allocation failure].
[4091] 379 ms: Scavenge 18.5 (48.5) -> 12.8 (49.5) MB, 3.8 ms [allocation failure].
[4091] 407 ms: Scavenge 19.9 (49.5) -> 14.1 (50.5) MB, 7.1 ms [allocation failure].
[4091] 489 ms: Scavenge 29.2 (51.5) -> 16.8 (52.5) MB, 6.3 ms [allocation failure].
[4091] 549 ms: Scavenge 30.7 (52.5) -> 19.3 (55.5) MB, 7.7 ms [allocation failure].
[4091] 607 ms: Scavenge 33.4 (55.5) -> 21.9 (57.5) MB, 7.4 ms [allocation failure].
[4091] 662 ms: Scavenge 35.9 (57.5) -> 24.4 (60.5) MB, 7.1 ms [allocation failure].
[4091] 713 ms: Scavenge 38.5 (60.5) -> 26.9 (62.5) MB, 7.1 ms [allocation failure].
[4091] 765 ms: Scavenge 41.0 (63.5) -> 29.5 (65.5) MB, 7.0 ms [allocation failure].
[4091] 815 ms: Scavenge 43.5 (66.5) -> 32.0 (67.5) MB, 6.9 ms [allocation failure].
[4091] 866 ms: Scavenge 46.1 (68.5) -> 34.6 (70.5) MB, 7.1 ms [allocation failure].
[4091] 954 ms: Scavenge 48.6 (70.5) -> 37.1 (73.5) MB, 10.3 ms (+ 29.1 ms in 208 steps since last GC) [allocation failure].
[4091] 1034 ms: Scavenge 51.2 (73.5) -> 39.7 (75.5) MB, 7.6 ms (+ 25.1 ms in 208 steps since last GC) [allocation failure].
[4091] 1050 ms: Mark-sweep 40.5 (75.5) -> 34.8 (77.5) MB, 11.8 ms (+ 56.1 ms in 429 steps since start of marking, biggest step 0.3 ms) [GC interrupt] [GC in old space requested].
[4091] 1103 ms: Scavenge 50.9 (77.5) -> 37.7 (77.5) MB, 4.6 ms [allocation failure].
[4091] 1152 ms: Scavenge 51.4 (77.5) -> 40.1 (79.5) MB, 7.1 ms [allocation failure].
[4091] 1200 ms: Scavenge 54.2 (79.5) -> 42.7 (81.5) MB, 6.5 ms [allocation failure].
[4091] 1249 ms: Scavenge 56.7 (81.5) -> 45.2 (83.5) MB, 6.7 ms [allocation failure].
[4091] 1297 ms: Scavenge 59.3 (83.5) -> 47.8 (84.5) MB, 6.4 ms [allocation failure].
...
...
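The trace above shows the heap climbing steadily (from roughly 10 MB to nearly 50 MB in this excerpt) as buffered writes pile up. One commonly suggested mitigation, assuming output to the terminal is queued asynchronously as described in the comments, is to batch the printing and yield to the event loop between batches so stdout gets a chance to flush. Note this only helps if the terminal can roughly keep up; respecting backpressure (as sketched earlier) bounds memory regardless.

```js
// Illustrative variant (not from the issue): print in batches and yield via
// setImmediate() so the event loop can service the queued stdout writes
// between batches.
var i = 0;

function printBatch() {
  for (var n = 0; n < 1000; n++) {
    console.log(i++);
  }
  // Let the event loop run (and flush pending output) before the next batch.
  setImmediate(printBatch);
}

printBatch();
```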