
AWS limit exceeded not bubbled up to console #11143

Closed
JonCubed opened this issue Jan 11, 2017 · 7 comments

@JonCubed

I'm currently getting the following in the console:

Error applying plan:

1 error(s) occurred:

* aws_kinesis_firehose_delivery_stream.kinesis_stream: timeout while waiting for state to become 'success' (timeout: 1m0s)

It took a while to find the cause, but the log file showed that the request failed with a 400 Bad Request because I had exceeded my Firehose stream limit:

plugin: terraform.exe: aws-provider (internal) 2017/01/11 21:55:01 [DEBUG] [aws-sdk-go] DEBUG: Response firehose/CreateDeliveryStream Details:
plugin: terraform.exe: ---[ RESPONSE ]--------------------------------------
plugin: terraform.exe: HTTP/1.1 400 Bad Request
plugin: terraform.exe: Connection: close
plugin: terraform.exe: Content-Length: 1116
plugin: terraform.exe: Content-Type: application/x-amz-json-1.1
plugin: terraform.exe: Date: Wed, 11 Jan 2017 10:55:02 GMT
plugin: terraform.exe: X-Amz-Id-2: Tq1q13oZctAAEJmln37QYtKfUZzYuswX3h/T7EauJur8vX+wgCDCnbny+yYXGsTdueVt91vfZbzoeSkEhxymVw==
plugin: terraform.exe: X-Amzn-Requestid: 674975a9-d7ec-11e6-a147-05a1bd9a3992
plugin: terraform.exe: 
plugin: terraform.exe: {"__type":"LimitExceededException","message":"You have already consumed your firehose quota of 20 hoses. Firehose names: [...]"}

It would be great if this error could bubble up to the console so we get a more meaningful error message.

Terraform Version

0.8.2

Affected Resource(s)

aws_kinesis_firehose_delivery_stream (probably others)

Steps to Reproduce

  1. Create 20 Firehose streams (or whatever your account's limit is)
  2. Try to add one more with Terraform
@afex

afex commented Jan 12, 2017

I am seeing similar behavior when attempting to create an aws_kinesis_stream that would exceed my shard limit.

@netjunki
Contributor

I hit another one of these with an aws_appautoscaling_target:

{"__type":"LimitExceededException","Message":"The maximum limit of 500 scalable targets per service namespace has been reached"}

@grubernaut, if you can give me some guidance on where in the code to look to deal with this more generally, I'd be happy to take a stab at a patch.

@grubernaut
Contributor

@netjunki Unfortunately, with the AWS provider we don't have direct access to the API, so we can't directly capture the 400 returned from the service in a general case. However, we can capture and handle the error correctly at the resource level. I can dig in later, but for now there are quite a few examples of error handling from the aws-sdk in the aws_instance resource.

@netjunki
Contributor

@grubernaut I know how to do that already. :-) I'll see if I can pull some patches together.

@grubernaut
Contributor

This looks related to the DynamoDB issue #13339, with the possible bug living in the throttling behavior inside the AWS SDK. Upstream issue: aws/aws-sdk-go#1271

@grubernaut
Contributor

Closed via #14571

@ghost

ghost commented Apr 12, 2020

I'm going to lock this issue because it has been closed for 30 days ⏳. This helps our maintainers find and focus on the active issues.

If you have found a problem that seems similar to this, please open a new issue and complete the issue template so we can capture all the details necessary to investigate further.

@ghost ghost locked and limited conversation to collaborators Apr 12, 2020

4 participants