
undici fetch has memory leak #1108

Closed
cjh980402 opened this issue Nov 21, 2021 · 39 comments

Labels
bug Something isn't working

@cjh980402
Contributor

cjh980402 commented Nov 21, 2021

Bug Description

When I use node-fetch or axios, there is no memory leak, as shown in the screenshot below.
However, I found a memory leak when I use undici.fetch.

Reproducible By

I made about 50–200 requests and then observed the memory leak easily.

Expected Behavior

There is no memory leak.

Logs & Screenshots

[screenshot]

Environment

Ubuntu 20.04
Node v16.13.0

Additional context

@cjh980402 cjh980402 added the bug Something isn't working label Nov 21, 2021
@ronag
Member

ronag commented Nov 21, 2021

Do you have a repro?

@cjh980402
Contributor Author

cjh980402 commented Nov 21, 2021

import { fetch } from 'undici';
import { setTimeout } from 'timers/promises';

const arr = [];
console.log(process.memoryUsage().heapUsed);

for (let i = 0; i < 200; i++) {
    if (i % 10 == 0) {
        console.log(process.memoryUsage().heapUsed);
    }
    for (let j = 0; j < 20; j++) {
        await (await fetch('https://www.youtube.com/')).text();
    }
    arr.push((await fetch('https://naver.com')).text());
}

await Promise.all(arr);
console.log(process.memoryUsage().heapUsed);

await setTimeout(10000);

console.log(process.memoryUsage().heapUsed);

await setTimeout(100000);

console.log(process.memoryUsage().heapUsed);

It produces this result:
[screenshot]
There is some memory leak, so memory usage keeps increasing.

@mcollina
Member

mcollina commented Nov 21, 2021

At first glance, your code behaves more or less as I expected. There should not be any memory leaks there.

(I should probably spend some time documenting how this happens - however I'm a bit short on time right now.)

@Ethan-Arrowood
Collaborator

It would be valuable to try to reproduce this issue with just undici.request to isolate the issue to the fetch implementation.

One area within fetch where this could happen is the Body mixin. We should ensure that once something like .text() is awaited, the stream is closed and destroyed.
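
A rough sketch of such an isolation test, reusing the shape of the original repro but going through undici.request directly (illustrative only, not taken from this thread):

import { request } from 'undici';

// Same loop shape as the repro above, but bypassing fetch entirely.
// If heap usage stays flat here while it grows with fetch, the problem
// is isolated to the fetch implementation rather than the client core.
console.log(process.memoryUsage().heapUsed);

for (let i = 0; i < 200; i++) {
    if (i % 10 === 0) {
        console.log(process.memoryUsage().heapUsed);
    }
    for (let j = 0; j < 20; j++) {
        const { body } = await request('https://www.youtube.com/');
        await body.text(); // fully consume the body so the connection is released
    }
}

console.log(process.memoryUsage().heapUsed);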

@mcollina
Member

mcollina commented Dec 4, 2021

@Ethan-Arrowood My understanding is that there is no issue. All the requests are still referenced in the array and they can't be collected.
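
To illustrate the point, a minimal sketch of the difference, using the same endpoint as the repro:

import { fetch } from 'undici';

// Grows: the array keeps every text() promise (and its resolved string)
// reachable for as long as the array itself is alive, so nothing can be
// collected early.
const arr = [];
for (let i = 0; i < 200; i++) {
    arr.push((await fetch('https://naver.com')).text());
}
await Promise.all(arr);

// Stays flat: each body is consumed and then immediately becomes
// unreachable, so the GC is free to reclaim it.
for (let i = 0; i < 200; i++) {
    await (await fetch('https://naver.com')).text();
}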

@ronag
Member

ronag commented Dec 8, 2021

@cjh980402 can you try again with the latest release?

@cjh980402
Contributor Author

[screenshot]
As a result of testing, it is much more stable than the previous version.
However, I will have to wait and see how it behaves when applied to the actual application.

@kyrylkov
Contributor

kyrylkov commented Dec 10, 2021

I can confirm a quite severe memory leak is still present with Node v16.13.1 and undici v4.11.3.

The issue is described in the Stack Overflow question Node.js non-heap memory leak.

As soon as we replace undici with node-fetch, the issue disappears.

However, do we know whether the issue is with undici itself or with Web Streams / Buffer.blob?

Also, switching back to node-fetch results in smaller GC pause times and smaller heap utilization. See the chart below, where just before 6pm node-fetch was deployed instead of undici.

[screenshot]

@ronag
Member

ronag commented Dec 10, 2021

@kyrylkov: Can you show how you use fetch?

@kyrylkov
Contributor

@ronag There are multiple cases. Let me collect some.

@ronag
Member

ronag commented Dec 10, 2021

Can you try with this branch? https://github.com/nodejs/undici/tree/finalization

@kyrylkov
Contributor

  const accessTokenRs = await fetch(URL, {
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
    },
    body: JSON.stringify(params),
  });
  const accessTokenRsJson = await accessTokenRs.json();
  await fetch(`${url}/lists/${listId}/members/${userHash}`, {
    method: 'DELETE',
    headers: {Authorization: BASIC_AUTH_KEY},
  });
  const res = await fetch(`${url}/lists/${listId}/members`, {
    method: 'GET',
    headers: {Authorization: BASIC_AUTH_KEY},
  });
  const resJson = await res.json();
  const res = await fetch(getUrl, {
    method: 'GET',
    headers:{
      'Authorization': `Bearer ${TOKEN}`,
    },
  });
  const text = await res.text();

@kyrylkov
Contributor

Can you try with this branch? https://github.com/nodejs/undici/tree/finalization

Trying....

@ronag
Member

ronag commented Dec 10, 2021

await fetch(`${url}/lists/${listId}/members/${userHash}`, {
    method: 'DELETE',
    headers: {Authorization: BASIC_AUTH_KEY},
  });

You need to consume the response. See https://github.com/nodejs/undici#garbage-collection.
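
For example, a sketch of consuming (or explicitly discarding) that DELETE response, using the same placeholders as the snippet above:

  const res = await fetch(`${url}/lists/${listId}/members/${userHash}`, {
    method: 'DELETE',
    headers: {Authorization: BASIC_AUTH_KEY},
  });
  // Either read the body to completion even if the payload is not needed...
  await res.arrayBuffer();
  // ...or cancel it explicitly so the underlying stream is released:
  // await res.body?.cancel();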

@kyrylkov
Contributor

kyrylkov commented Dec 10, 2021

You need to consume the response. See https://github.com/nodejs/undici#garbage-collection.

@ronag Alright. But this cannot be the cause of the leak, because this specific code is called once a week at most.

@kyrylkov
Contributor

Can you try with this branch? https://github.com/nodejs/undici/tree/finalization

@ronag GC pauses immediately got at least 2x worse. Need more time to see if the memory leak is still present.

@kyrylkov
Contributor

@ronag It's leaking again.

[screenshot]

@mcollina
Member

The following does not leak memory:

import { fetch } from './index.js';
import { setTimeout } from 'timers/promises';

for (let i = 0; i < 200; i++) {
    if (i % 10 == 0) {
        console.log(process.memoryUsage().heapUsed / 1024 / 1024);
    }
    for (let j = 0; j < 20; j++) {
        await (await fetch('https://www.youtube.com/')).text();
    }
    await (await fetch('https://naver.com')).text();
}
console.log('all req sent');

console.log(process.memoryUsage().heapUsed / 1024 / 1024);

await setTimeout(10000);

console.log(process.memoryUsage().heapUsed / 1024 / 1024);

await setTimeout(100000);

console.log('end');

@kyrylkov
Contributor

@mcollina The leak we're observing is in non-heap memory, so my understanding is that the test above wouldn't show it.
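
One way to make that visible is to log the full process.memoryUsage() breakdown rather than only heapUsed; a generic sketch:

// rss covers the whole process, including memory outside the V8 heap;
// external and arrayBuffers track native allocations such as Buffers.
setInterval(() => {
    const { rss, heapUsed, external, arrayBuffers } = process.memoryUsage();
    console.log({
        rssMB: (rss / 1024 / 1024).toFixed(1),
        heapUsedMB: (heapUsed / 1024 / 1024).toFixed(1),
        externalMB: (external / 1024 / 1024).toFixed(1),
        arrayBuffersMB: (arrayBuffers / 1024 / 1024).toFixed(1),
    });
}, 1000);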

@mcollina
Member

Could you please include a test to reproduce?

What exactly is the scale of those graphs?

What's the output you're seeing when Node is giving you the out of memory error?

@kyrylkov
Contributor

kyrylkov commented Dec 11, 2021

Could you please include a test to reproduce?

We don't have a test because it's a big app and we've been chasing a non-heap memory leak for weeks. We seem to have isolated the cause to undici fetch vs node-fetch.

What's exactly is the scale of those graphs?

Total memory on the chart in the Stack Overflow question is 768MB.

What's the output you're seeing when Node is giving you the out of memory error?

We would need to allow it to crash again to see the exact out-of-memory error. The problem is that GC pauses increase 2x-3x as soon as we switch to undici fetch, which negatively affects app performance in production. Thus running these tests until out of memory is both time consuming and bad for the production application experience overall. We don't have a way to reproduce it outside of the production environment.

@mcollina
Member

We would need to allow it to crash again to see the exact out-of-memory error. The problem is that GC pauses increase 2x-3x as soon as we switch to undici fetch, which negatively affects app performance in production. Thus running these tests until out of memory is both time consuming and bad for the production application experience overall. We don't have a way to reproduce it outside of the production environment.

The finalization branch linked above should resolve it. This is actually pretty bad and possibly a reason not to be spec-compliant, @ronag, and drop the FinalizationRegistry.

@kyrylkov
Contributor

@mcollina What we can and will do next week is try to run for 24 hours with undici fetch, recording memory and GC profiles and then another 24 hours with node-fetch. That should answer lots of questions.

@mcollina
Member

We really need a way to reproduce the problem if we want to fix it.

@therapy

therapy commented Dec 15, 2021

Here's a good way to reproduce the memory leak.

index.js:

const { Worker } = require('worker_threads');
const os = require('os');

for (let i = 0; i < 50; i++) {
	new Worker('./worker.js');
}

setInterval(() => {
    console.log(os.freemem());
}, 1000)

worker.js:

const { fetch } = require("undici");

const main = async () => {
    const res = await fetch("https://jsonplaceholder.typicode.com/posts/1");
    const json = await res.json();

    main();
};

main();

result with undici fetch:

8510468096
8388669440
8287883264
8248619008
8189575168
8142159872
8103874560
8013099008
7929524224
7844200448
7754084352
7699202048
7761747968
7981531136
8034414592
8069316608
8003137536
7938416640
7885611008
7810236416
7747260416
7687892992
7635963904
7576334336
7512018944
7481835520
7407083520
7363784704
7309643776
7222276096
7145009152
7120793600
7067721728
7029264384
6971269120
6915682304
6861324288
6814982144
6746136576
6654898176

exact same program with node-fetch:

8516784128
8501202944
8491171840
8480931840
8474705920
8485122048
8497430528
8497270784
8625246208
8608874496
8605544448
8605483008
8598585344
8599298048
8598294528
8597331968
8600895488
8595931136
8596119552
8594587648
8593002496
8598953984
8598147072

@mcollina
Member

I have tested this for about 10 minutes. Eventually it stabilizes but it degrades pretty sharply.
My understanding is this is somewhat related to the FinalizationGroup handling that we do with fetch() to handle some of the spec-required stuff.

@ronag I think we might want to remove it.

@ronag
Member

ronag commented Dec 15, 2021

But we removed that?

@mcollina
Member

It seems it's still there:

const requestFinalizer = new FinalizationRegistry(({ signal, abort }) => {
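
For context, the general shape of that pattern (a finalizer that aborts the request once its Response is garbage collected) looks roughly like the sketch below; this is illustrative only, not undici's actual implementation, and registerResponse is a hypothetical helper:

// Sketch of an abort-on-GC finalizer. The held value must not reference the
// response itself, otherwise the registration would keep it alive.
const requestFinalizer = new FinalizationRegistry(({ signal, abort }) => {
    if (!signal.aborted) {
        abort(); // free the underlying connection once the response is unreachable
    }
});

// Hypothetical helper: associate a response with its AbortController.
function registerResponse (response, controller) {
    requestFinalizer.register(response, {
        signal: controller.signal,
        abort: () => controller.abort(),
    });
}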

@szmarczak
Member

@mcollina This one was added after the removal:

[screenshots]

@mcollina
Member

I've tried removing that and it's not the cause of the problem.

Technically it's not a leak but there is some very odd memory behavior that we should fix.

@ronag
Member

ronag commented Dec 16, 2021

I haven't really spent much time optimising allocations. It might also be related to web streams.


@mcollina
Member

I cannot reproduce it directly when using a single process. This is quite interesting:

const { fetch } = require(".");
const { isMainThread } = require('worker_threads');
const os = require('os');

const main = async () => {
    const res = await fetch("https://jsonplaceholder.typicode.com/posts/1");
    const json = await res.json();

    main();
};

main();
main();
main();
main();
main();

if (isMainThread) {
  setInterval(() => {
      console.log(os.freemem());
  }, 1000)
}

See:
[screenshot]

Very interestingly, most of the allocations come from Node core internals:

[screenshot]

Going back to the multithreaded example, on my system the RSS stabilizes at 1.5 GB of RAM. My theory is that each worker uses about 30 MB of RAM (* 50 = 1.5 GB) because of the activity they need to do - this is consistent with the heap requirements of running a Node.js application.

See nodejs/node#34823 for more details on the topic of the memory requirements of worker threads.

Now, undici fetch() is not gentle on memory allocations and it should probably be optimized to reduce memory consumption, but this can be closed.

@ronag
Member

ronag commented Dec 16, 2021

Landing this in core will probably help somewhat: nodejs/node#41121

@mcollina
Member

Hopefully that'll happen soon.

@ronag
Member

ronag commented Dec 22, 2021

Didn't make it into Node 17.3. Please try later with Node 17.4+ and Undici 4.12.1+

@ronag
Member

ronag commented Dec 22, 2021

Scratch that. It did make it into Node 17.2. Please try with Node 17.2+ and Undici 4.12.1+.

@therapy

therapy commented Dec 22, 2021

(quoting the worker_threads repro from the earlier comment above)

I have re-run this:

https://pastebin.com/wqU6f14F

undici 4.12.1
node v17.3.0

It looks like the memory usage is still very weird, but it seems to stabilize after 90 seconds while using 1.5 GB of RAM.


@awinogrodzki

awinogrodzki commented Oct 3, 2024

To anyone revisiting this issue in 2024:

I encountered a very similar memory leak caused by retained undici requests in the FinalizationRegistry, both locally and in a production environment.

After hours of experimentation I narrowed the issue down to being related to Node.js 20.x on ARM processors. For some reason the leak was evident and caused linear memory growth only on Node.js 20 (disclaimer: tested only on the latest 20.17 LTS and 20.16). When I downgraded Node.js to 18 or upgraded to the current Node.js 22, the leak was no longer present. It may also be relevant that both my local device and the production server are based on the ARM architecture.

If you're researching a leak and running the code on Node.js 20, it's probably a good idea to run a quick test against other Node versions before diving into the implementation.
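
A minimal test of that kind, run unchanged under each Node version being compared, could look like the sketch below (the endpoint is just the placeholder service used earlier in this thread):

import { fetch } from 'undici';

// Hammer fetch in a loop and log rss/heapUsed periodically; run the same
// file under Node 18, 20, and 22 and compare the growth curves.
for (let i = 0; i < 1000; i++) {
    const res = await fetch('https://jsonplaceholder.typicode.com/posts/1');
    await res.json();
    if (i % 100 === 0) {
        const { rss, heapUsed } = process.memoryUsage();
        console.log(i, `rss ${(rss / 1048576).toFixed(1)} MB`, `heap ${(heapUsed / 1048576).toFixed(1)} MB`);
    }
}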

Here are example snapshots after running the same test on Node.js 20.17 vs Node.js 22.9

Operating System:
  Platform: darwin
  Arch: arm64
  Version: Darwin Kernel Version 23.6.0: Mon Jul 29 21:14:30 PDT 2024; root:xnu-10063.141.2~1/RELEASE_ARM64_T6030
Binaries:
  Node: 20.17.0
  npm: 10.8.2
  Yarn: 4.1.0
[screenshot]
Operating System:
  Platform: darwin
  Arch: arm64
  Version: Darwin Kernel Version 23.6.0: Mon Jul 29 21:14:30 PDT 2024; root:xnu-10063.141.2~1/RELEASE_ARM64_T6030
Binaries:
  Node: 22.9.0
  npm: 10.8.3
  Yarn: 4.1.0
[screenshot]

As you can see, the memory leak is simply gone after switching from Node 20.17 to 22.9.
