
Elastic Beanstalk removes all settings when config template is applied #6636

Closed
tecnobrat opened this issue May 12, 2016 · 3 comments

@tecnobrat commented May 12, 2016

Running Terraform 0.6.16.

If you have configuration that looks like:

resource "aws_elastic_beanstalk_application" "app" {
    name = "beanstalk-app"
    description = ""
}

resource "aws_elastic_beanstalk_environment" "environment" {
    name = "beanstalk-env"
    application = "${aws_elastic_beanstalk_application.app.name}"
    solution_stack_name = "64bit Amazon Linux 2016.03 v2.1.0 running Multi-container Docker 1.9.1 (Generic)"

    setting {
        namespace = "aws:elasticbeanstalk:application:environment"
        name = "NEW_RELIC_NO_CONFIG_FILE"
        value = "true"
    }

    setting {
        namespace = "aws:ec2:vpc"
        name = "ELBScheme"
        value = "internal"
    }
}

But then you change it to use a config template, like:

resource "aws_elastic_beanstalk_application" "app" {
    name = "beanstalk-app"
    description = ""
}

resource "aws_elastic_beanstalk_environment" "environment" {
    name = "beanstalk-env"
    application = "${aws_elastic_beanstalk_application.app.name}"
    template_name = "${aws_elastic_beanstalk_configuration_template.template.name}"

    setting {
        namespace = "aws:elasticbeanstalk:application:environment"
        name = "NEW_RELIC_NO_CONFIG_FILE"
        value = "true"
    }
}

resource "aws_elastic_beanstalk_configuration_template" "template" {
    name = "beanstalk-config"
    application = "${aws_elastic_beanstalk_application.app.name}"
    solution_stack_name = "64bit Amazon Linux 2016.03 v2.1.0 running Multi-container Docker 1.9.1 (Generic)"

    setting {
        namespace = "aws:ec2:vpc"
        name = "ELBScheme"
        value = "internal"
    }
}

You will see a plan that correctly says it is going to remove ELBScheme from the environment and leave NEW_RELIC_NO_CONFIG_FILE untouched, but after the apply, NEW_RELIC_NO_CONFIG_FILE is missing if you check Beanstalk on AWS. If you run another plan / apply, NEW_RELIC_NO_CONFIG_FILE is added back to the environment successfully.

@catsby (Contributor) commented Jul 22, 2016

Hey @tecnobrat – this is odd. I believe what we're seeing is that, upon switching to a Template, AWS is clearing out the settings for this env and only using the settings found in the Template. I'm not sure where specifically this is happening, but I'll continue to investigate!

@catsby (Contributor) commented Jul 28, 2016

Hey @tecnobrat

So, what I've found is that this behavior is an artifact of both declaring settings in-line and moving your environment to a template. When you move your environment to use a template, we effectively clear out the existing settings; thus the removal of NEW_RELIC_NO_CONFIG_FILE. Terraform, however, does not quite understand that consequence, and only sees a change in the config where ELBScheme is concerned. So the template change wipes out both ELBScheme and NEW_RELIC_NO_CONFIG_FILE, but the template itself restores ELBScheme. Your next plan shows that it wants to add NEW_RELIC_NO_CONFIG_FILE because it is no longer found and is not in the template.
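
Given that sequence, a sketch of a possible workaround (an illustration assuming the behavior described above, not a fix confirmed in this thread) is to carry the in-line setting into the configuration template for the switch, so that the same apply that clears the environment's settings restores it from the template:

# Sketch only: assumes template application clears the environment's
# settings and then applies the template's own settings, as described
# above. Declaring NEW_RELIC_NO_CONFIG_FILE here means the template
# restores it in the same apply that wipes it.
resource "aws_elastic_beanstalk_configuration_template" "template" {
    name = "beanstalk-config"
    application = "${aws_elastic_beanstalk_application.app.name}"
    solution_stack_name = "64bit Amazon Linux 2016.03 v2.1.0 running Multi-container Docker 1.9.1 (Generic)"

    setting {
        namespace = "aws:ec2:vpc"
        name = "ELBScheme"
        value = "internal"
    }

    setting {
        namespace = "aws:elasticbeanstalk:application:environment"
        name = "NEW_RELIC_NO_CONFIG_FILE"
        value = "true"
    }
}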

Unfortunately there isn't a lot I can do here at this time. It still seems reasonable to allow in-line settings, so that you can override any template ones if necessary. It's just this scenario of switching over from a stack name with no template to a template with a mix of settings where things get lost. I believe this is a property of the AWS API, and you'd see the same results if you were to use the API directly (or with the aws CLI).
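
As a minimal sketch of the in-line override mentioned above (the "public" value is hypothetical, purely for illustration), an environment-level setting block naming the same namespace and option as a template setting would be the way to override the template for that one environment:

# Sketch of the in-line override described above: the environment keeps
# using the template, but this setting block overrides the template's
# aws:ec2:vpc / ELBScheme value for this environment only. The value
# "public" is a hypothetical example, not taken from this thread.
resource "aws_elastic_beanstalk_environment" "environment" {
    name = "beanstalk-env"
    application = "${aws_elastic_beanstalk_application.app.name}"
    template_name = "${aws_elastic_beanstalk_configuration_template.template.name}"

    setting {
        namespace = "aws:ec2:vpc"
        name = "ELBScheme"
        value = "public"
    }
}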

@ghost commented Apr 23, 2020

I'm going to lock this issue because it has been closed for 30 days ⏳. This helps our maintainers find and focus on the active issues.

If you have found a problem that seems similar to this, please open a new issue and complete the issue template so we can capture all the details necessary to investigate further.

@ghost locked and limited conversation to collaborators Apr 23, 2020