Error during deployment regarding concurrency #251

Closed
mvanaltvorst opened this issue Dec 30, 2021 · 5 comments · Fixed by #319
Labels: bug, workaround available

Comments

mvanaltvorst commented Dec 30, 2021

Thank you for this project; it is precisely what I was looking for. However, I encounter an error when I try to deploy a basic configuration:

╷
│ Error: error setting Lambda Function (tf-next_tfn-deploy) concurrency: InvalidParameterValueException: Specified ReservedConcurrentExecutions for function decreases account's UnreservedConcurrentExecution below its minimum value of [50].
│ {
│   RespMetadata: {
│     StatusCode: 400,
│     RequestID: "005a79fe-c425-4a34-98e2-61d39be18bf3"
│   },
│   Message_: "Specified ReservedConcurrentExecutions for function decreases account's UnreservedConcurrentExecution below its minimum value of [50]."
│ }
│
│   with module.tf_next.module.statics_deploy.module.deploy_trigger.aws_lambda_function.this[0],
│   on .terraform/modules/tf_next.statics_deploy.deploy_trigger/main.tf line 14, in resource "aws_lambda_function" "this":
│   14: resource "aws_lambda_function" "this" {
│
╵

Have you encountered this error before, or do you know where I could learn more about what might be causing it? Thank you.

ofhouse (Member) commented Jan 11, 2022

Hey, sorry for the late reply.

Looks like this has something to do with other Lambda functions deployed to your AWS account.
A thread on the Serverless framework forum provides some explanation of why this error message might occur: https://forum.serverless.com/t/unreservedconcurrentexecution-below-its-minimum-value-of-100/10323/4
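For anyone who wants to check their own account, the Lambda account-settings API shows the account-wide limit and how much of it is still unreserved; a minimal sketch with the AWS CLI (region/profile flags omitted):

# Show the total concurrency limit and the remaining unreserved pool.
aws lambda get-account-settings \
  --query 'AccountLimit.{Total:ConcurrentExecutions,Unreserved:UnreservedConcurrentExecutions}'

The error appears whenever reserving concurrency for a function would push the unreserved pool below AWS's floor of 50, so an account whose total limit is only 50 cannot reserve anything at all.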

ofhouse added the question label Jan 11, 2022

khuezy commented Jan 15, 2022

I'm also getting this error while playing with the free tier. My account has a full account concurrency of 50, and the proxy Lambda reserves 1, which brings the unreserved pool under the minimum value of 50.

How does one reduce the minimum value?


khuezy commented Jan 15, 2022

@mvanaltvorst Did you get this resolved? If not, you can request a quota increase on your main region:
https://us-west-2.console.aws.amazon.com/servicequotas/home/services/lambda/quotas/L-B99A9384

I think anything above 50 would work, but I requested 200 just to be sure. It takes 2-3 days for AWS to confirm the request.
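For reference, the same increase can be requested without the console; a sketch using the AWS CLI (the desired value of 200 matches the one mentioned above):

# Request a higher "Concurrent executions" quota (code L-B99A9384) for Lambda.
aws service-quotas request-service-quota-increase \
  --service-code lambda \
  --quota-code L-B99A9384 \
  --desired-value 200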

ofhouse added the bug label and removed the question label Feb 1, 2022
ofhouse (Member) commented Feb 1, 2022

Saw this today in a newly created AWS account too.

Looks like AWS drastically decreased the quota value here:
[screenshot: decreased Lambda concurrent executions quota]

Requesting an increase should work; I will add this as a doc entry to the repo.
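Once AWS grants the increase, it can be verified before re-running the deployment; a sketch using the AWS CLI:

# Check the currently applied quota value for Lambda concurrent executions ...
aws service-quotas get-service-quota \
  --service-code lambda \
  --quota-code L-B99A9384 \
  --query 'Quota.Value'

# ... and confirm the account now reports more than 50 unreserved executions.
aws lambda get-account-settings \
  --query 'AccountLimit.UnreservedConcurrentExecutions'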

ofhouse (Member) commented Feb 1, 2022

Added a support document for this issue here: Function decreases account's UnreservedConcurrentExecution below its minimum value

Also pinged AWS support about the issue here: https://twitter.com/milliHQ/status/1488592847825780743

ofhouse added the workaround available label Feb 1, 2022
ofhouse added this to the v1.0.0-canary.2 milestone May 31, 2022