Error: memory limit exceeded. Function invocation was interrupted. #1077
@siddhant-mohan Please ignore my earlier comment about providing logs; not sure how I missed them. My only theory for how we could leak memory doesn't seem to apply here: when you issue hundreds of parallel operations, we can spin up multiple gRPC clients, which greatly increases the amount of memory required. It looks like your function fails with just a single gRPC client, which should not be happening. To look at this further, we probably need some investigation by our backend team. They might have access to more detailed logs that explain your memory profile. Would you mind opening a ticket here: https://cloud.google.com/support-hub
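In case it helps anyone hitting the parallel-operations scenario described above, here is a minimal sketch of bounding the number of in-flight writes so the client library doesn't need to fan out across extra gRPC channels. The collection name and `CHUNK_SIZE` are illustrative assumptions, not documented values:

```ts
// A minimal sketch, not the library's recommended API: keep only a bounded
// number of writes in flight at once instead of firing hundreds in parallel.
import * as admin from "firebase-admin";

admin.initializeApp();
const db = admin.firestore();

const CHUNK_SIZE = 100; // illustrative; tune for your workload

async function writeInChunks(
  docs: Array<{ id: string; data: FirebaseFirestore.DocumentData }>
): Promise<void> {
  for (let i = 0; i < docs.length; i += CHUNK_SIZE) {
    const chunk = docs.slice(i, i + CHUNK_SIZE);
    // At most CHUNK_SIZE operations are pending at any moment.
    await Promise.all(
      chunk.map((d) => db.collection("items").doc(d.id).set(d.data))
    );
  }
}
```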
This is the response I got from the team. Can anyone help here? @schmidt-sebastian @yoshi-automation

> Sorry for the late response; this case took us more time to analyze. I received feedback from the engineering team, and we took a look at the instances: there are instances showing sustained growth until they die. The instances start off at 180-300MB, then grow steadily for 30-60 seconds, at which point they hit 2GB and die off. So the growth is entirely in your container. I recommend you investigate your code for leaks, profiling if necessary. This article explains the common memory leaks in Node.js and how to detect them. I believe this can help you!
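For anyone trying to do the profiling the support team suggests, a small sketch using only standard Node.js (`process.memoryUsage()`) to log heap usage at checkpoints and confirm where the growth happens:

```ts
// Log resident set size and heap usage with a label, so successive calls
// show whether memory grows across a suspected section of the function.
function logMemory(label: string): void {
  const { rss, heapUsed } = process.memoryUsage();
  const mb = (n: number) => (n / 1024 / 1024).toFixed(1);
  console.log(`${label}: rss=${mb(rss)}MB heapUsed=${mb(heapUsed)}MB`);
}

// Example usage around the suspected code:
// logMemory("before loop");
// ...do work...
// logMemory("after loop");
```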
Thanks for getting back to us. If all goes well, @bcoe knows where to route this from here :)
I'm also getting this error when trying to encode an image to a Blurhash. I have no detailed error description, just the out-of-memory error. Any updates on this?
@bcoe Have you seen issues like this before? I wonder if we should route this to the Functions team. |
@mrousavy @schmidt-sebastian Sorry for missing the initial ping. I'm not sure of the best workaround; it might be worth opening an issue on the GCF issue tracker.
@siddhant-mohan Have you come across any solution to this? I am able to deploy with 2GB memory and a 540s timeout, but not below that, and it is costing more than our budget. It would be helpful if somebody could help us.
Hey, any updates on this? I am having issues with a callable function that creates at most about 20k documents. I tried increasing the memory allowance to 512MB, but the weird part is that the logs show it never got past 180MB of usage.
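In case it's useful, one common pattern for creating that many documents while keeping memory flat (an assumption about the workload, not a confirmed fix for this bug) is to commit them sequentially in batches of at most 500, Firestore's per-batch write limit. The collection name here is hypothetical:

```ts
// Write ~20k documents in sequential batches of 500 so only one batch's
// worth of writes is ever buffered in memory at a time.
import * as admin from "firebase-admin";

admin.initializeApp();
const db = admin.firestore();

async function createDocs(
  items: FirebaseFirestore.DocumentData[]
): Promise<void> {
  for (let i = 0; i < items.length; i += 500) {
    const batch = db.batch();
    for (const item of items.slice(i, i + 500)) {
      batch.set(db.collection("documents").doc(), item);
    }
    await batch.commit(); // wait before building the next batch
  }
}
```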
@MorenoMdz Same for me: the logs show no growth in memory usage beyond 150MB, and the runtime settings are 2GB. Still getting the error.
@jokasimr I resolved this issue by eliminating global variables. This may be one of your problems as well.
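For anyone unsure what that looks like, a hypothetical illustration of the anti-pattern: module-level state survives across invocations because the container is reused, so it grows until the instance dies. All names here are invented:

```ts
// Anti-pattern: module-level state lives as long as the container instance.
const cache: unknown[] = [];

export async function leakyHandler(payload: unknown): Promise<void> {
  cache.push(payload); // never cleared between invocations: a leak
}

// Fix: scope the data to a single invocation so it can be garbage-collected.
export async function fixedHandler(payload: unknown): Promise<void> {
  const local: unknown[] = [payload]; // released after the function returns
  void local; // ...use it here...
}
```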
@jainabhishek14 Thanks, I'll look into it some more. |
As @jainabhishek14 said, I also resolved it by refactoring my code. A few things to note:
In my scenario, I moved the creation of the subcollection items to a trigger: for example, when doc A is created, the trigger does its job and creates the related docs. The downside for me is that it now takes quite a bit longer than before for the new documents to be ready to read. A sketch of the approach follows.
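Here is a rough sketch of that trigger approach, assuming a hypothetical `docsA/{docId}` collection; the subcollection and field names are invented:

```ts
// When a doc is created in docsA, create its related subcollection docs
// from a Firestore trigger instead of inside the original request handler.
import * as functions from "firebase-functions";
import * as admin from "firebase-admin";

admin.initializeApp();

export const onDocACreated = functions.firestore
  .document("docsA/{docId}")
  .onCreate(async (snap, context) => {
    const batch = admin.firestore().batch();
    // Create the related documents in a subcollection of the new doc.
    for (let i = 0; i < 10; i++) {
      const ref = snap.ref.collection("related").doc();
      batch.set(ref, { index: i, parent: context.params.docId });
    }
    await batch.commit(); // a single batch supports up to 500 writes
  });
```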
ping |
Environment details

@google-cloud/firestore version:

The output of setLogFunction is:
I even increased the memory to 2GB, but it still gives me the same error, even for a small cron job. I have followed all the answers here but couldn't resolve it. #768
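For reference, this is how the 2GB / 540s runtime settings discussed in this thread are configured with the firebase-functions v1 API; the schedule string and function name are only examples:

```ts
// Configure memory and timeout for a scheduled (cron) function.
import * as functions from "firebase-functions";

export const smallCronJob = functions
  .runWith({ memory: "2GB", timeoutSeconds: 540 })
  .pubsub.schedule("every 24 hours")
  .onRun(async () => {
    // cron work here
    return null;
  });
```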
package.json file:
My Firebase function file: