Memory Leak in requests for Datastore #1414

It looks like there is a memory leak in this very simple example that I put together to test Datastore. I'm using Node 6.2 and gcloud 0.36.0. After running it for 20 minutes, memory usage had increased by 50 MB. Is this a bug, or am I doing something wrong? I also ran other tests with more complex code that, given more time, ended up using more than 1 GB.
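The original snippet did not survive extraction; below is a minimal, hypothetical reconstruction based on the modified version posted later in this thread (the `Url` kind, the test URL, and the `require('gcloud')()` usage are taken from there, and the 50 ms interval is an assumption matching the ~20 requests per second discussed below):

```js
// Hypothetical reconstruction of the reporter's test; the original snippet
// was lost in extraction. Shape and names come from the modified version
// posted later in this thread.
const gcloud = require('gcloud')(); // gcloud 0.36.0
const datastore = gcloud.datastore();

// Fetch the same entity on an interval and watch process memory grow.
setInterval(() => {
  const key = datastore.key(['Url', 'www.test-url.com']);
  datastore.get(key, () => {}); // result is discarded; only the call matters
}, 50); // ~20 requests per second
```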
Comments
Thanks for reporting! In the case you showed, 50 MB used after making API requests at that rate seems reasonable: 20 calls per second for 20 minutes is 24,000 calls. Even though unused objects are on the chopping block, Node can be pretty relaxed about how quickly it runs the garbage collector. That said, we have had reports of high memory usage with the Datastore API, specifically around our use of the gRPC library. We've made some improvements, such as caching open connections, but those are already in 0.36.0. Can you put a sample together that uses more than 1 GB of memory? If it's reproducible by just running a single script, that will make it much easier to track down.
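One way to rule out a merely lazy collector (a sketch added for this write-up, not from the original thread) is to force a full collection before each sample:

```js
// Run with: node --expose-gc monitor.js
// Forcing a full collection before each sample rules out "the GC just
// hasn't run yet" as the explanation for a growing heapUsed number.
setInterval(() => {
  if (global.gc) {
    global.gc(); // only defined when Node is started with --expose-gc
  }
  const usedMb = Math.round(process.memoryUsage().heapUsed / 1000000);
  console.log(usedMb + ' mb after forced GC');
}, 1000);
```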
Hi @stephenplusplus, I have published the example app we are using at https://github.com/seedtag/test-gcloud. To run it, you need to provide a project id in package.json and authenticate (we do that with the init.sh script executed from our Dockerfile). I increased the requests per second (our real system has to deal with a lot of requests), and you can see in the chart how memory keeps growing beyond 1 GB in around 3 hours. We also tried invoking the GC manually, but we got the same result: the memory was not freed. Let me know if I can provide any other information to help with this. Thank you!
Thank you for putting that together! I let the test run for about 3 hours as well, and my memory usage got up to ~1 GB. I made some modifications to the test file to monitor the memory:

```js
const gcloud = require('gcloud')();
const datastore = gcloud.datastore();

// Heap usage at startup, so later samples can be reported as deltas.
var BASE_MEMORY = getMemory();
var memoryUsage;

function getUrl(urlId) {
  const key = datastore.key(['Url', urlId]);
  datastore.get(key, () => {}); // fire the request; ignore the result
}

function getMemory() {
  // Heap used, rounded to whole megabytes.
  return Math.round(process.memoryUsage().heapUsed / 1000000);
}

function logMemoryUsage() {
  console.log(getMemory() - BASE_MEMORY + ' mb');
}

// Issue a Datastore request every 25 ms (~40 per second).
setInterval(() => {
  getUrl('www.test-url.com');
}, 25);

// Once a second, log heap growth whenever the rounded value changes.
setInterval(() => {
  if (memoryUsage !== (memoryUsage = getMemory())) {
    logMemoryUsage();
  }
}, 1000);
```

I'll continue looking into this and keep you posted. Let me know if you make any discoveries as well!
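One caveat worth adding here (an editorial note, not part of the original comment): a leak inside a native addon like grpc is allocated outside the V8 heap, so `heapUsed` can stay flat while the process keeps growing. Logging `rss` alongside `heapUsed` catches that case:

```js
// Log whole-process and V8-heap memory side by side; a leak in a native
// addon (such as grpc's C++ core) grows rss without growing heapUsed.
setInterval(() => {
  const mem = process.memoryUsage();
  console.log(
    'rss: ' + Math.round(mem.rss / 1000000) + ' mb, ' +
    'heapUsed: ' + Math.round(mem.heapUsed / 1000000) + ' mb'
  );
}, 5000);
```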
Let's move the discussion over to grpc/grpc#7349, as I think gRPC is the best place to look for the growth at the moment. If we find out we need to make a change inside this library, we'll follow up with a new issue.
Hello @stephenplusplus, is there any estimated date for the fix? I have followed the discussion in grpc/grpc#7349 and tried a gcloud-node fork built from the branch with the fix, but I got this error when connecting to Datastore (I couldn't reproduce it against the emulator):

TypeError: callback is not a function

Thank you!
@stephenplusplus it looks like grpc 1.0.0 includes the fix for this issue. Any timeline on when this version will make it to google-cloud-node? Thanks!
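(An editorial aside, not from the thread: to see which grpc version your installed google-cloud-node tree actually resolves, you can run `npm ls grpc`, or check from Node itself:)

```js
// Prints the grpc version resolved from this project's node_modules;
// assumes grpc is installed as a (transitive) dependency.
console.log(require('grpc/package.json').version);
```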
I'm going to start on that now, thanks for the reminder! #1543
That was fast, thanks a lot! :D