Increased memory usage in nodejs version 20.16.0 due to fetch() #54274
FWIW @snyamathi believes that this is caused by nodejs/undici@63b7794 CC @nodejs/undici
Possible fix in nodejs/undici#3445
First, a memory leak happens when memory is never released. In the diagram above, the GC is at work reclaiming and releasing all the memory, so this bug is not a memory leak but a less alarming "increased memory usage" bug. Second, I would really love to see a reproduction that does not use Next.js, because Next.js monkey-patches fetch()... which might interact badly with undici's "fire and forget" logic (we need to support fetches that do not consume the response).
I've done a bit of research on this problem, and I came to these conclusions:
Something in the latest undici releases increased memory consumption for the Next.js cache; this does not affect vanilla usage. We need a reproduction without Next.js. I've been trying to follow what their patching does, but I was not able to identify the problem.
#54286 contains the patched version of undici.
That PR has landed, so I've closed this issue. Feel free to reopen if you disagree.
I am encountering this with an Express app (not Next.js), and it appears to be an actual leak in my case: the heap grows until the process crashes. I'll try distilling it to a minimal repro case; it sounds like that could still be useful? Or are we likely all set with that PR?
Open a new issue; this case is very specific to Next.js, so yours is likely something else.
We encountered this bug and confirmed that it still exists in Node 20.17.
@kouhin the fix landed in undici v6.19.6, and Node 20.17 still uses undici v6.19.2: nodejs/undici@v6.19.5...v6.19.6, https://github.com/nodejs/node/blob/v20.17.0/src/undici_version.h#L5. Edit: I've opened a PR to backport the fix to Node v20: #54583
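If you are unsure which undici your Node binary ships, you can check at runtime rather than reading `src/undici_version.h` (this assumes Node >= 18, where the bundled undici version appears in `process.versions`):

```javascript
// Print the undici version bundled with the running Node binary, to confirm
// whether the fixed release (>= 6.19.7) is present before up/downgrading.
const bundled = process.versions.undici;
console.log(`node ${process.versions.node} bundles undici ${bundled}`);
```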
The fix shipped in undici v6.19.7, which was included in Node v22.7.0. It will be part of the next v20 release if that release happens more than two weeks from now. @nodejs/releasers
Reopening to track this until it ships in v20.
@mcollina we are on v20. Is there a suggested workaround, rather than going to v22 or waiting for the next release, or do we downgrade to 20.15.1 and use that?
You can always download the latest undici from npm.
@RedYetiDev we tried that, however it didn't work for us.
You may be experiencing a different issue; if so, please open an issue in nodejs/help or nodejs/undici.
This is getting off topic for this issue, but the global fetch (https://nodejs.org/dist/latest-v20.x/docs/api/globals.html#fetch) that comes with Node itself is different from adding a dependency on the npm package undici. In theory, you could patch the global fetch with the npm package, though in practice you're probably better off just sticking to a version before this issue was present (e.g. 20.15.1) or after the patch is backported (TBD). The reason you don't want to patch
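The "patch the global fetch" option mentioned above could look something like this hypothetical preload file (the file name, the preload usage, and the version pin are assumptions; pin whichever npm undici release contains the fix, i.e. >= 6.19.7):

```javascript
// patch-fetch.js -- hypothetical bootstrap, e.g. `node -r ./patch-fetch.js app.js`.
// Swaps the global fetch for the npm `undici` implementation when that package
// is installed; otherwise it leaves Node's bundled fetch in place.
let fetchImpl = globalThis.fetch;
try {
  // Requires a `undici` entry in package.json pinned to a fixed release.
  ({ fetch: fetchImpl } = require('undici'));
} catch {
  // npm package not installed; keep the built-in fetch.
}
globalThis.fetch = fetchImpl;
```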
To add to this: I had to downgrade to 20.15.1 to fix the leak on my project with Next.js (v14.2.5, unchanged). @RedYetiDev downloading undici would do the trick, at the cost of having yet another dependency (and we would have to remember to remove it sooner or later when we upgrade Node).
Any ETA on this getting backported to v20?
The fix shipped in Node.js v20.18.0 |
Discussed in https://github.com/orgs/nodejs/discussions/54248
Originally posted by SteffenHo August 7, 2024
We saw a massive increase in memory in our Next.js application after updating to version 20.16.0. We build our application with Docker and the 20-alpine image. Other users were also able to confirm this problem: https://www.reddit.com/r/node/comments/1ejzn64/sudden_inexplicable_memory_leak_on_new_builds/
We then switched to version 20.15.1 and the memory returned to normal.
Maybe someone else could confirm this problem too?
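For anyone trying to confirm the same behaviour, one simple approach (illustrative, not from the thread) is to log process memory periodically while the app serves traffic, then compare the curves between Node 20.15.1 and 20.16.0:

```javascript
// Log RSS / heap / external memory once per second; run the same workload on
// both Node versions and compare. `external` is where undici response buffers
// tend to show up.
const interval = setInterval(() => {
  const { rss, heapUsed, external } = process.memoryUsage();
  const mb = (n) => `${(n / 1048576).toFixed(1)} MB`;
  console.log(`rss=${mb(rss)} heapUsed=${mb(heapUsed)} external=${mb(external)}`);
}, 1000);
interval.unref(); // don't keep the process alive just for the logger
```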
@ronag could you have a look into this?