
provider/variable sharing in modules #18406

Closed · a14m opened this issue Jul 6, 2018 · 3 comments

a14m commented Jul 6, 2018

Not to clutter the repo with a similar question, but is there a way of abstracting the provider for all the modules defined in a project?

For example, I have this project:

├── modules
│   ├── RDS
│   └── VPC
└── stacks
    ├── production
    │   └── main.tf
    └── staging
        └── main.tf

and it works fine. The problem is with the definition of the modules:

├── RDS
│   ├── README.md
│   ├── main.tf
│   ├── providers.tf
│   └── variables.tf
└── VPC
    ├── README.md
    ├── main.tf
    ├── providers.tf
    └── variables.tf

The provider in both of these modules is exactly the same:

# providers.tf
provider "aws" {
  region = "${var.region}"
  version = "~> 1.26"
}

and the variables in each module are different, but they all have the region variable:

# variables.tf
variable "region" {
  default     = "eu-central-1"
  description = "AWS region."
}
# other module dependent variables...

Is there a way to define those bits of information at the modules level, so that I end up with something roughly like this?

├── modules
│   ├── providers.tf  <<< include the *shared* provider definition block
│   ├── variables.tf  <<< include the *shared* region variable definition block
│   ├── RDS
│   │   ├── README.md
│   │   ├── main.tf
│   │   └── variables.tf
│   └── VPC
│       ├── README.md
│       ├── main.tf
│       └── variables.tf

Thanks in advance.

apparentlymart (Contributor) commented:
Hi @a14m!

In Terraform 0.11 we refined the ways in which providers pass between modules to better support this situation. As of that release, we recommend that most users put provider blocks only in the root module, with child modules then inheriting the root provider configurations or being passed them explicitly.

In your case then, the provider "aws" block would belong in stacks/production and stacks/staging (the two root modules, presumably) and there would be no provider blocks in the modules at all.
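As a sketch of that arrangement (the module names, paths, and arguments here are illustrative, based on the directory tree above):

```hcl
# stacks/production/main.tf (illustrative sketch, not from the issue)
provider "aws" {
  region  = "eu-central-1"
  version = "~> 1.26"
}

module "vpc" {
  source = "../../modules/VPC"
  # VPC-specific variables...
}

module "rds" {
  source = "../../modules/RDS"
  # RDS-specific variables...
}
```

With this layout, both child modules inherit the single aws provider configuration from the root module, and modules/RDS and modules/VPC no longer need a providers.tf at all.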

At present this pattern will hit the limitation discussed in #16835, preventing those child modules from declaring which version of the aws provider they need. The next major version of Terraform will include the new syntax discussed in that issue so that the modules will be able to declare their provider version requirements separately from the provider configurations themselves:

# (not yet implemented and details may change before release)

terraform {
  required_providers = {
    aws = "~> 1.26"
  }
}

There is more information on this in the Providers within Modules section of the documentation, including an optional syntax for explicitly passing providers into child modules for situations where the default inheritance behavior is insufficient.
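For reference, the explicit-passing form looks roughly like this in Terraform 0.11 syntax (the "ireland" alias and region are hypothetical, for illustration only):

```hcl
# stacks/production/main.tf (sketch; the aws.ireland alias is an assumption)
provider "aws" {
  alias  = "ireland"
  region = "eu-west-1"
}

module "vpc" {
  source = "../../modules/VPC"

  # Pass the aliased configuration as this module's default "aws" provider
  providers = {
    "aws" = "aws.ireland"
  }
}
```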

I believe that this (except for the forthcoming change discussed in #16835) covers your use-case. Please let me know if I missed something!

a14m (Author) commented Jul 6, 2018

Thanks a lot, this covers the provider-configuration use-case.
As for the shared-variable part: I did some research after opening this (sorry for not researching thoroughly before opening the issue) and found issues where it has been discussed in detail. The conclusion (TL;DR for those who don't want to go through the list/discussions): sharing variables between modules goes against Terraform's clarity/explicitness principle.

adding those issues here for future reference:

a14m closed this as completed Jul 6, 2018
ghost commented Apr 4, 2020

I'm going to lock this issue because it has been closed for 30 days ⏳. This helps our maintainers find and focus on the active issues.

If you have found a problem that seems similar to this, please open a new issue and complete the issue template so we can capture all the details necessary to investigate further.

ghost locked and limited conversation to collaborators Apr 4, 2020