[Bug]: aws_lambda_function crashes with fatal error: out of memory #31571
This is not a bug in the provider. Per the documentation, packaged Lambdas cannot exceed 50 MB compressed and 250 MB uncompressed, and your zip file violates both limits. You can move to image-based Lambdas by building your own container.
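For anyone landing here, a minimal sketch of what the image-based route looks like in Terraform. The function name, role ARN, and ECR image URI below are placeholders, not anything from this issue:

```hcl
# Sketch only: all names, ARNs, and the image URI are placeholders.
resource "aws_lambda_function" "from_image" {
  function_name = "my-large-function"
  role          = "arn:aws:iam::123456789012:role/lambda-exec" # placeholder execution role
  package_type  = "Image"                                      # container image instead of a zip
  image_uri     = "123456789012.dkr.ecr.us-east-1.amazonaws.com/my-lambda:latest"

  memory_size = 512
  timeout     = 30
}
```

Container image functions accept images up to 10 GB, which is why they are the usual workaround once a zip package outgrows the 50 MB / 250 MB limits.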
Hey @JonathanRys 👋 I think this might be a combination of a couple of things. There is a bug that we're working to address (hashicorp/aws-sdk-go-base#490) which has caused out of memory errors similar to the one you saw here (#31644, #31570), and I suspect you're hitting it as well. Based on the comment above, once we've patched that bug you'll probably start to see other errors with this configuration until you take the actions @a087674 laid out. With that in mind, would you feel comfortable with us closing this issue out?
hashicorp/aws-sdk-go-base#490 addresses a problem with large response payloads and logging, so it won't solve this one. This error has to do with serializing the large zip payload in the request.
Warning: This issue has been closed, meaning that any additional comments are hard for our team to see. Please assume that the maintainers will not see them. Ongoing conversations amongst community members are welcome; however, the issue will be locked after 30 days. Moving conversations to another venue, such as the AWS Provider forum, is recommended. If you have additional concerns, please open a new issue, referencing this one where needed.
I'm going to lock this issue because it has been closed for 30 days ⏳. This helps our maintainers find and focus on the active issues.
Terraform Core Version
1.4.6
AWS Provider Version
4.67.0
Affected Resource(s)
aws_lambda_function
Expected Behavior
It should have deployed my Lambda function.
Actual Behavior
runtime: out of memory: cannot allocate 671088640-byte block (1065025536 in use)
fatal error: out of memory
Zip file size: 305 MB (320,278,949 bytes)
Relevant Error/Panic Output Snippet
Terraform Configuration Files
Steps to Reproduce
This worked fine with source_file instead of source_dir, except that my libraries weren't available. Since I can't mix the two, I moved the Lambda handler and packages into a folder and used source_dir instead; that's when I got this error.
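(For context, the setup described here is roughly of the shape below; the directory layout, handler name, runtime, and role ARN are illustrative placeholders rather than the reporter's actual configuration.)

```hcl
# Illustrative shape of the described setup; paths, handler, runtime, and role are placeholders.
data "archive_file" "lambda_package" {
  type        = "zip"
  source_dir  = "${path.module}/lambda" # handler plus the pip-installed packages
  output_path = "${path.module}/lambda.zip"
}

resource "aws_lambda_function" "from_zip" {
  function_name    = "my-function"
  role             = "arn:aws:iam::123456789012:role/lambda-exec" # placeholder execution role
  handler          = "handler.lambda_handler"                     # placeholder module.function
  runtime          = "python3.10"
  filename         = data.archive_file.lambda_package.output_path
  source_code_hash = data.archive_file.lambda_package.output_base64sha256
}
```

The zip produced by archive_file is the payload the provider then serializes into the API request, which is where the comment above says the allocation fails.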
My requirements.txt:

Debug Output
https://gist.github.com/JonathanRys/43815164fdca16f1a3752ad43a9c221e#file-crash-log
Panic Output
see above
Important Factoids
Processor Intel(R) Core(TM) i7-6700K CPU @ 4.00GHz 4.01 GHz
Installed RAM 16.0 GB
System type 64-bit operating system, x64-based processor
Pen and touch No pen or touch input is available for this display
Edition Windows 10 Pro
Version 22H2
Installed on 11/17/2021
OS build 19045.2846
Experience Windows Feature Experience Pack 120.2212.4190.0
I tried removing the preinstalled packages and installing requirements.txt on the server like so:

I also got an out of memory error that way. I tried zipping it myself and got a smaller file, 290 MB (304,642,515 bytes), but it failed with the same error. I tried it with 512 MB of memory and it still failed. I tried using pip install --only-binary, but it made no difference. I tried to download it from S3, but it is apparently too large. That is probably the source of the issue, but I would expect a more graceful error from the plugin.
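(For reference, the S3 route mentioned above is normally expressed like the sketch below; the bucket, key, and other names are placeholders, and the 250 MB unzipped package limit still applies, so a ~300 MB zip fails either way.)

```hcl
# Placeholder bucket/key; the hard Lambda package size limits still apply.
resource "aws_s3_object" "lambda_package" {
  bucket = "my-deploy-bucket"
  key    = "lambda/lambda.zip"
  source = "${path.module}/lambda.zip" # the large zip built locally
  etag   = filemd5("${path.module}/lambda.zip")
}

resource "aws_lambda_function" "from_s3" {
  function_name    = "my-function"
  role             = "arn:aws:iam::123456789012:role/lambda-exec" # placeholder execution role
  handler          = "handler.lambda_handler"
  runtime          = "python3.10"
  s3_bucket        = aws_s3_object.lambda_package.bucket
  s3_key           = aws_s3_object.lambda_package.key
  source_code_hash = filebase64sha256("${path.module}/lambda.zip")
}
```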
References
May be related to #29153
Would you like to implement a fix?
None