
interpolation of files #215

Closed
payneio opened this issue Aug 21, 2014 · 30 comments · Fixed by #1778

Comments

@payneio

payneio commented Aug 21, 2014

I believe it would be useful to extend the file(..) function to allow for variable replacement. This would allow, for example, passing Terraform variables into a cloud-config.yaml.

Perhaps as simple as opening up the interpolation function:
user_data = ${interpolate(file("cloud-config.yaml"))}

Or maybe just an optional attribute on the file function:
user_data = ${file("cloud-config.yaml", true)}

@mitchellh
Contributor

Dup of #159

@payneio
Author

payneio commented Aug 21, 2014

Not the same issue. I'm talking about allowing terraform variables to be inserted into files included with the file function.

@mitchellh
Contributor

Yep, just noticed, reopening. Sorry about that.

@mitchellh mitchellh reopened this Aug 21, 2014
@mitchellh
Contributor

So, we've discussed supporting this. The primary issue is that graph creation won't work because we can't inspect ahead of time what variables will be needed for the interpolation. For example:

foo = "${file("bar.txt")}"

And bar.txt is:

Hello: ${aws_instance.foo.id}

Unless we load "bar.txt" early, we can't know that a dependency must be made to aws_instance.foo.

So, we could load the file early. But we can't, because that parameter might itself be a variable:

foo = "${file(var.foo)}"

And we can't know that ahead of time. A solution then, is to perhaps specify in advance what variables you need. Hypothetical syntax:

foo = "${file(var.foo, bar=aws_resource.foo.id)}"

And bar.txt:

Hello: ${var.bar}

@mitchellh
Contributor

I've tagged this as an enhancement since it is something we want one day, but don't plan on doing this quite yet.

@pearkes
Contributor

pearkes commented Aug 21, 2014

@payneio A workaround for your issue may be using a heredoc, though. That should now work as per hashicorp/hcl#6. (Terraform uses HCL for configuration).
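A minimal sketch of that heredoc workaround, inlining the user data directly in the resource so normal interpolation applies (the resource name, AMI, and variables here are illustrative, not from the thread):

```hcl
resource "aws_instance" "web" {
  ami           = "ami-123456"   # illustrative AMI
  instance_type = "t1.micro"

  # Inlining the cloud-config as a heredoc lets Terraform interpolate
  # variables directly, at the cost of embedding the file in the config.
  user_data = <<EOF
#cloud-config
hostname: ${var.hostname}
fqdn: ${var.hostname}.${var.domain}
EOF
}
```

Because the content lives inside the configuration, Terraform can see every referenced variable up front, which sidesteps the graph-construction problem described above.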

@alekstorm
Contributor

Couldn't we read the file and interpolate its contents as two separate passes?

@mitchellh
Contributor

@alekstorm Only if you know the file beforehand; note the earlier example where I set the filename from another variable. Imagine that variable is the result of a resource attribute: it requires graph traversal in order to determine the value. At that point, Terraform might read the file and only then realize there is a graph cycle or something.

Right now Terraform is able to determine graph cycles 100% before any execution happens. It would be really unfortunate if we lost this property.

@mitchellh
Contributor

(Whoops, meant @alekstorm, sorry Alex Gaynor if you got pulled into this)

@alekstorm
Contributor

Ah, good point, thanks. Could we then distinguish between computed variables, like an id, and statically-known ones, like user variables or a name? Only statically-known variables would be allowed to be passed to the file() function.

This may be too much complexity for too little benefit, though.

@mitchellh
Contributor

I think your last sentence nails it. Not only do I believe it's too much complexity, but I think the limitation is non-intuitive and annoying. For a new user coming to TF, I'd be annoyed to get some weird error about "computed variable not allowed as file parameter." (It may be obvious to you or me that a DAG is being made and so OBVIOUSLY you can't have that, but I wouldn't say it's generally intuitive.)

Therefore, if we use the explicit variable exposure approach above, I think we can get uniformity with the rest of the interpolation style while allowing variables to be interpolated within the files themselves.

It unfortunately has some complexity: named parameters aren't accepted in interpolations right now.

But that complexity is much less than anything else, I think.

@dgarstang

We pass user data to new instances. The hostname is set by the user data. We need to be able to interpolate/template the hostname variable in the file passed to the user data. That's our use case.

@lamielle
Contributor

I have the same use case as @dgarstang: I would love to be able to perform variable replacement in the user data (cloud-config) file passed to a CoreOS machine. For now, I'm simply embedding the file data in the terraform config. This seems less than ideal to me, but works for now.

@punnie

punnie commented Dec 12, 2014

I have the same use case as both @dgarstang and @lamielle. Supporting this makes a lot of sense given the presence of the "count" parameter.

Wondering if there's a workaround for this.

@MerlinDMC
Contributor

I currently get around this by using a heredoc (as @pearkes suggested), but I gave the file provisioner the option to take an inline_script source to push to the instance, which looks like:

  provisioner "file" {
    inline_script = true
    source = <<EOF
{
  "bundlesequence": [ "ensure_software_cassandra_installed" ],
  "personas": [ "is_cassandra_host" ],
  "packages": {
    "2014Q2": {
      "apache-cassandra": "2.1.0"
    }
  },
  "config": {
    "cassandra": {
      "cluster_name": "TestCluster",
      "seeds": "${join(",",sdc_instance.cassandra.*.network.1.address)}"
    }
  }
}
EOF
    destination = "/var/machine.json"
  }

@punnie

punnie commented Dec 18, 2014

Thank you very much for the workaround @MerlinDMC. Much appreciated!

Unfortunately that doesn't solve my issue, as my goal is to populate whatever I'm sending in the user_data field with dynamic data, particularly the hostname that I need to further provision the machine using chef.

I get around this limitation by invoking a script in a generic user_data file that sets the hostname according to the value present in the AWS tag "Name" for that instance. It does work, but I don't like it very much.. 😉

@punnie

punnie commented Dec 18, 2014

Nevermind my last comment, I'm a jackass. Had no idea Terraform also supported heredocs in the user_data parameter. 👍

@chipi

chipi commented Jan 28, 2015

@MerlinDMC I am using the latest TF build (v0.3.6) and it seems that heredocs for the file provisioner are not supported, since I get an error on the inline_script attribute and a syntax error when applying a heredoc to the source attribute. What I am doing is specifying a file provisioner within an aws_instance resource to copy over and interpolate a file.

resource "aws_instance" "name" {
  bla
  bla

  provisioner "file" {
        inline_script = true
        source = <<EOF
bla
bla
        EOF
        destination = "/etc/my.conf"
    }
}

Am I missing something on how to use heredoc?

@MerlinDMC
Contributor

@chipi I did branch terraform and added functionality for that one :/

(This commit: MerlinDMC@a0f677e)

@mwitkow

mwitkow commented Mar 2, 2015

The problem with heredocs is that if you want to use the same piece of inlined configuration in multiple places, you probably want to have it in a separate file and source that in. But then you can't use parameters in it, and you end up with code duplication.

There's another feature request: gsub #1029, and it may allow something like:

${gsub("$MY_VAR", var.my, file("../"))}

Yes, it would require you to specify the parameters to be substituted manually, but it would be explicit for the user that there's something fishy happening.

@mwitkow

mwitkow commented Mar 2, 2015

Or maybe even piggy-backing on the gsub implementation, can we have an optional parameter to file that would take interpolations map?

e.g.:
${file("../myfile", {"variable_name": "${var.somevar}"})}

@mitchellh
Contributor

@mwitkow-io That is the plan, we just don't support that syntax yet but that is what we'd like to do.

@tayzlor

tayzlor commented Mar 19, 2015

👍

@apparentlymart
Contributor

Rather than this being a simple function, could it instead be a resource so that it can participate normally in the dependency graph?

resource "string_from_template" "example" {
    template_file = "template.txt"
    vars {
        ip_addr = "${aws_instance.example.public_ip}"
    }
}

...then elsewhere in the config, use ${string_from_template.example.rendered} to get the result of rendering the template.

Unless I'm missing something, this allows both the filename and the variables to be dynamic by re-using the existing dependency management mechanisms, at the expense of a more verbose (but arguably more readable) usage syntax.

@josharian
Contributor

@apparentlymart I started implementing your suggestion and have a basic working version. (It is missing a few things, such as the lookup function, but the basics seem to work.)

Having done so, I'm not sure it is the right approach. I'm new to terraform, though, so perhaps I'm doing it wrong.

The main problem I see with having templates be resources is that there is no way that I can see to sanity check the template at plan time. This is because interpolation might not be done, so things like the template filename might not be available. As a result, any errors in the template (syntax errors, missing keys, etc.) won't show up until execution time, which is pretty unfortunate.

Thoughts?

@apparentlymart
Contributor

@josharian... That is fair. Seems true of many resources, though... If I create an AWS VPC and use it to create a subnet then the subnet id can't be checked until apply time, for example.

Seems like the plan tells you what terraform will attempt, not what will succeed.

With that said, I am pretty new to this tool too, so I wouldn't take my opinion too seriously. I'd love to hear from a hashicorp staffer on this.

@josharian
Contributor

I'd love to hear from a hashicorp staffer on this.

Me too. I'd like to do the work to get this issue fixed, but I want to make sure I go down the right path. @mitchellh, care to weigh in about your preferred direction? Thanks!

@mitchellh
Contributor

@josharian You're right, but the template can also fail at runtime. The partial-state nature of Terraform will allow this to happen pretty gracefully. I think it's the best we can do.

josharian added a commit to josharian/terraform that referenced this issue May 2, 2015
@apparentlymart
Contributor

In case anyone finds this years later: I've now gone from being "pretty new to this tool" to working on it as my day job 🙃 and also in Terraform 0.12 we finally had all the necessary parts in place to make this a function like originally proposed: templatefile.

The template_file data source is still there, but is no longer the primary way to render templates from separate files.
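For readers landing here now, a minimal sketch of the templatefile approach (the file name and variable names below are illustrative):

```hcl
resource "aws_instance" "web" {
  # ... other arguments elided ...

  # Render an external template, passing an explicit map of variables.
  # Only the names in this map are visible inside the template, which
  # preserves the explicit-dependency property discussed earlier.
  user_data = templatefile("${path.module}/cloud-config.yaml", {
    hostname = var.hostname
  })
}
```

Inside cloud-config.yaml, the passed variable is then referenced as ${hostname}.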

@ghost

ghost commented May 2, 2020

I'm going to lock this issue because it has been closed for 30 days ⏳. This helps our maintainers find and focus on the active issues.

If you have found a problem that seems similar to this, please open a new issue and complete the issue template so we can capture all the details necessary to investigate further.

@ghost ghost locked and limited conversation to collaborators May 2, 2020