Memory leak #512
Do you reckon you can come up with a minimal example that shows the behaviour? (Are there any preconditions other than, say, using more than one channel?) How are the nested streams nested?
Sorry for the slow reply @squaremo. I'll have a think about how I could replicate this without using private code, but I would also point out that we only observe this under reasonably heavy load. The streams are nested inasmuch as each stream has a
@house92 did y'all ever figure out the solution to this issue? We are seeing the same pattern currently, where usage slowly grows and then spikes before dropping back to normal.
Maybe #658 improves the memory footprint.
I've run the following script, which uses the random-readable package to generate a stream of data to publish:

```js
const random = require('random-readable');
const { v4: uuid } = require('uuid');
const amqplib = require('amqplib');

(async () => {
  const stats = { published: 0, received: 0 };

  const publisherConnection = await amqplib.connect('amqp://localhost');
  const publisherChannel = await publisherConnection.createChannel();
  await publisherChannel.assertQueue('test', { durable: false, autoDelete: true });

  const consumerConnection = await amqplib.connect('amqp://localhost');
  const consumerChannel = await consumerConnection.createChannel();
  const consumerTag = uuid();
  await consumerChannel.consume('test', (message) => {
    consumerChannel.ack(message);
    stats.received++;
    dumpStats();
  }, { noAck: false, consumerTag });

  const stream = random
    .createRandomStream()
    .on('error', console.error)
    .on('data', (data) => {
      // Respect back pressure: pause the stream when the channel's write buffer is full
      const ok = publisherChannel.sendToQueue('test', data);
      if (!ok) stream.pause();
      stats.published++;
    })
    .on('end', () => {
      console.log('end');
    });

  publisherChannel.on('drain', () => {
    stream.resume();
  });

  process.once('SIGINT', async () => {
    stream.pause();
    publisherChannel.removeAllListeners('drain');
    publisherChannel.on('drain', async () => {
      await publisherChannel.close();
      await publisherConnection.close();
    });
    await consumerChannel.cancel(consumerTag);
    await consumerChannel.close();
    await consumerConnection.close();
    dumpStats();
    process.exit(0);
  });

  function dumpStats() {
    process.stdout.clearLine(0);
    process.stdout.cursorTo(0);
    process.stdout.write(`published: ${stats.published}, received: ${stats.received}`);
  }
})();
```

The total memory footprint of the node process stayed under 90MB. I realise this is an old issue, but I am wondering whether you ever got to the bottom of it. Could the problem have been that, under high load, your application was receiving more work than it could handle? You'll see in the above example that I'm careful to pause the stream so that I don't flood the publisher channel.
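For what it's worth, a simple way to watch the footprint while a script like this runs is to log `process.memoryUsage()` periodically. This is a minimal sketch rather than part of the script above; the 5-second interval and the formatting are arbitrary choices:

```js
// Minimal sketch: periodically log the Node process's memory usage while the
// publisher/consumer script above is running. The 5-second interval is arbitrary.
const format = (bytes) => `${(bytes / 1024 / 1024).toFixed(1)}MB`;

setInterval(() => {
  const { rss, heapUsed, heapTotal, external } = process.memoryUsage();
  console.log(
    `rss: ${format(rss)}, heapUsed: ${format(heapUsed)}, ` +
    `heapTotal: ${format(heapTotal)}, external: ${format(external)}`
  );
}, 5000).unref(); // unref() so the timer alone doesn't keep the process alive
```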
Closing, as this is an old issue and I was unable to reproduce it.
To outline our app:
We first observed a gradual memory creep followed by a spike. Having recorded the heap allocations using the Chrome dev tools, it appears that this is related to the `Mux` class, which is used inside `Connection`. We end up with many streams that are nested hundreds of thousands of times, totalling over 200MB.

Before delving deeper, I wanted to check with you what the function of the `Mux` class is, and whether this sounds like a symptom of our misuse of the library or an area for potential optimisation within your code. If the latter, I'm still happy to fork the repo and look into it, but any clarification would be very welcome.
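As an aside (not from the original report), one way to confirm which objects are accumulating in a case like this is to capture heap snapshots from the running process and compare them in the Chrome dev tools. A minimal sketch using Node's built-in `v8.writeHeapSnapshot()`, triggered on demand via a signal:

```js
// Minimal sketch (not from the original report): write a heap snapshot each time
// the process receives SIGUSR2, so snapshots taken some minutes apart can be
// loaded in the Memory tab of Chrome dev tools and diffed to see which objects
// (e.g. nested streams) are growing. SIGUSR1 is reserved by Node for the debugger.
const v8 = require('v8');

process.on('SIGUSR2', () => {
  const file = v8.writeHeapSnapshot(); // writes a .heapsnapshot file in the cwd and returns its name
  console.log(`heap snapshot written to ${file}`);
});
```

Sending `kill -USR2 <pid>` a couple of times a few minutes apart, then opening the resulting files in the comparison view of the Memory tab, makes growth in a particular class of objects fairly easy to spot.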