Memory leak in consumers framework API? #546
I can verify this is happening.
@albnnc to unblock you, please use the consume API like this:

```ts
const consumer = await js.consumers.get("messages", "myconsumer");
const iter = await consumer.consume({ max_messages: 1 });
for await (const m of iter) {
  console.log(m?.info.redeliveryCount);
  m?.nak();
}
```
The above will do the same thing but doesn't trigger the leak: at any one point the consumer holds only one message, which is effectively what you are doing with `next()` in the loop.
@albnnc I have figured out where the issue is. While I make a release, the above suggestion will unblock you. Thank you for finding this.
@aricart, thanks for the response! What I actually need is to pull a bunch of messages, limited to a certain maximum count, and process them while blocking the receipt of further messages. With

```ts
const consumer = await js.consumers.get("ENTITY", "ENTITY_CONSUMER");
const batchSizeMax = 1_000;
let batch: JsMsg[] = [];
while (!nc.isClosed()) {
  const iter = await consumer.fetch({ max_messages: batchSizeMax });
  for await (const msg of iter) {
    batch.push(msg);
    if (!msg.info.pending) {
      break;
    }
  }
  if (batch.length) {
    console.log(`Processing batch of size ${batch.length}`);
    console.log("redeliveryCount", batch[0].info.redeliveryCount);
    batch.forEach((v) => v.nak());
    batch = [];
  } else {
    await delay(1_000);
  }
}
```

However, this code leaks too (although much more slowly). Will it be fixed in the upcoming PR?
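As a side note, the draining logic in the loop above (read from the iterator until `info.pending` reaches zero or a size cap is hit) can be isolated into a helper and unit-tested without a NATS server. This is only a sketch: `MsgLike` and `collectBatch` are hypothetical names that mirror the `JsMsg` fields the loop touches; they are not part of the nats API.

```typescript
// Minimal message shape mirroring the JsMsg fields used above (hypothetical).
interface MsgLike {
  info: { pending: number };
}

// Drain an async iterable into a batch, stopping early once the server
// reports no more pending messages, or once the size cap is reached.
async function collectBatch<T extends MsgLike>(
  iter: AsyncIterable<T>,
  maxSize: number,
): Promise<T[]> {
  const batch: T[] = [];
  for await (const msg of iter) {
    batch.push(msg);
    if (!msg.info.pending || batch.length >= maxSize) {
      break;
    }
  }
  return batch;
}
```

Factoring the drain out this way makes it easy to verify the early-stop behavior (the `pending === 0` break) separately from the real consumer.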
@albnnc the leak in fetch would also be fixed by the same PR, but I will make sure (running the tests right now). If I understand correctly, what you want is to retrieve N messages that you can assign to workers and process concurrently. If that is the case, then fetch is what you want. See https://github.com/nats-io/nats.deno/blob/main/jetstream.md#fetching-batch-of-messages
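The fetch-then-dispatch pattern described here can be sketched independently of NATS. The helper below is a hypothetical illustration (`processConcurrently` is not a nats API): it consumes any async iterable while keeping at most `limit` handlers in flight, which is essentially what assigning fetched messages to N workers amounts to.

```typescript
// Process items from an async iterable with at most `limit` handlers
// running concurrently; resolves once every handler has settled.
async function processConcurrently<T>(
  iter: AsyncIterable<T>,
  limit: number,
  handler: (item: T) => Promise<void>,
): Promise<void> {
  const inFlight = new Set<Promise<void>>();
  for await (const item of iter) {
    // Track the handler's promise and remove it when it settles.
    const p = handler(item).finally(() => inFlight.delete(p));
    inFlight.add(p);
    if (inFlight.size >= limit) {
      // Backpressure: wait for any one handler to finish before pulling more.
      await Promise.race(inFlight);
    }
  }
  await Promise.all(inFlight);
}
```

Because the loop awaits `Promise.race` when the limit is reached, it never pulls more items than the workers can absorb, which keeps memory bounded regardless of how many messages the source can deliver.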
@albnnc yes the PR fixes fetch as well. |
Yeah, I saw that part of the docs. The notable part is that I need to start handling messages immediately, even when fewer messages than the requested batch size are available.
I can confirm this too. Will wait for release, thanks! |
@albnnc I have released new versions of all the clients fixing this issue; if you notice anything else, please holler.
My team was using `js.pullSubscribe`, but we found the new API comfortable to work with and decided to update our codebase. However, we were unable to migrate without trouble. Consider the following code snippet:
Memory leaks dramatically, eventually making the app fail with a V8 error.
Is there any fundamental mistake in the code above? Thanks in advance.