
cdktf.out: Inconsistent stacks in cdktf.out when creating stacks dynamically #2929

Closed
1 task
suvhotta opened this issue Jun 12, 2023 · 9 comments
Labels
bug (Something isn't working) · stale (An issue or pull request that has not been updated in a very long time) · waiting-on-answer

Comments

@suvhotta

Expected Behavior

I'm creating stacks dynamically by passing the stack ids as environment variables at runtime.
After deploying two sets of infra, all of the created stacks should be present in the cdktf.out directory, so that cdktf list reports them consistently.

Actual Behavior

After deploying both sets, cdktf.out contains only the stacks from the latest deployment (frontend_stack_dev2, backend_stack_dev2); the first set (frontend_stack_dev, backend_stack_dev) is missing, which leads to inconsistent results from cdktf list. Ideally cdktf.out should contain all 4 stacks.

Steps to Reproduce

main.py:

import os

from cdktf import App

from infra.backend import BackendStack
from infra.frontend import FrontendStack

app = App()

# Stack ids are supplied at deploy time via environment variables.
frontend_stack_id = os.environ.get('frontend_stack_id')
backend_stack_id = os.environ.get('backend_stack_id')

FrontendStack(app, frontend_stack_id)
BackendStack(app, backend_stack_id)

app.synth()

Deploying the first set of infra:

export frontend_stack_id=frontend_stack_dev
cdktf deploy frontend_stack_dev

export backend_stack_id=backend_stack_dev
cdktf deploy backend_stack_dev

Deploying the second set of infra:

export frontend_stack_id=frontend_stack_dev2
cdktf deploy frontend_stack_dev2

export backend_stack_id=backend_stack_dev2
cdktf deploy backend_stack_dev2

Versions

language: python 3.11
cdktf-cli: 0.15.5
cdktf: 0.15.5
node: v20.2.0

Providers

No response

Gist

No response

Possible Solutions

No response

Workarounds

No response

Anything Else?

No response

References

My use case involves passing stack names at runtime and thus creating the stack ids dynamically. I tried using Terraform Variables, but since they can only be used within a Stack/Construct scope, I was unable to use them directly in main.py.
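To illustrate the scoping constraint, here is a minimal sketch (the stack class and variable name are hypothetical) of how a TerraformVariable has to live inside a stack rather than next to App() in main.py:

# Sketch only: TerraformVariable requires a construct scope, so it can
# only be declared inside a stack, not at the top level of main.py.
from constructs import Construct
from cdktf import TerraformStack, TerraformVariable

class FrontendStack(TerraformStack):
    def __init__(self, scope: Construct, id: str):
        super().__init__(scope, id)
        # valid here: "self" provides the required scope
        TerraformVariable(self, "env", type="string")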
I went through:

Help Wanted

  • I'm interested in contributing a fix myself

Community Note

  • Please vote on this issue by adding a 👍 reaction to the original issue to help the community and maintainers prioritize this request
  • Please do not leave "+1" or other comments that do not add relevant new information or questions; they generate extra noise for issue followers and do not help prioritize the request
  • If you are interested in working on this issue or have submitted a pull request, please leave a comment
@suvhotta suvhotta added bug Something isn't working new Un-triaged issue labels Jun 12, 2023
@suvhotta suvhotta changed the title Inconsistent stacks in cdktf.out when creating stacks dynamically cdktf.out: Inconsistent stacks in cdktf.out when creating stacks dynamically Jun 13, 2023
@jsteinich
Collaborator

cdktf works under the assumption that your code is the authoritative source. When a stack is removed from your code (e.g. by dynamically disabling it), cdktf assumes that you are intentionally removing it.
Take a look at #838 for more details and possible alternative paths.
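For illustration, a minimal sketch of one such alternative path, assuming every environment is known up front: instantiate all stacks unconditionally so that cdktf.out always contains them, and pick the stack to act on via the CLI. The environment list below is made up:

# Sketch only: all stacks are always declared, so none of them disappear
# from cdktf.out when a different one is deployed.
from cdktf import App

from infra.backend import BackendStack
from infra.frontend import FrontendStack

app = App()

for env in ("dev", "dev2"):  # hypothetical, fixed list of environments
    FrontendStack(app, f"frontend_stack_{env}")
    BackendStack(app, f"backend_stack_{env}")

app.synth()

A single stack can then still be deployed on its own, e.g. cdktf deploy frontend_stack_dev2, without the others being removed from cdktf.out.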

@suvhotta
Author

@jsteinich Thanks for routing me to that issue. Reading through it made sense of why it's happening this way with stacks in cdktf.out. As you mentioned, I could see that when I remove a stack from the code dynamically by altering the env variable associated with it, it also gets removed from cdktf.out.

I had a couple of further questions on the same:

  1. If a stack is not present in the cdktf.out folder, I can see that cdktf list gives inconsistent results, but what other repercussions could it have during cdktf synth, cdktf deploy or cdktf destroy cycles? Are there any other side effects/issues that I've missed?
    JFYI: I have a remote AWS S3 Terraform backend.
  2. Is there a better way to pass the stack ids dynamically during the cdktf synth, cdktf deploy or cdktf destroy cycles, other than OS env variables?
    I've explored TF Vars, but they need to be defined within a stack/construct and not independently in main.py, as my use case above demonstrates.

@jsteinich
Collaborator

  1. Any cross-stack dependencies on missing stacks could fail. This is only an issue because cdktf checks for them; Terraform will happily reference the remote state.
  2. Environment variables are most likely your best option. Note that you can have a single value that controls a lookup into configuration data for further tuning (see the sketch below). There is an open issue about adding other ways. See Specifying context values in the CDKTF CLI #2019
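A minimal sketch of that pattern in main.py, where a single hypothetical "env" variable selects a block of made-up configuration data:

# Sketch only: one environment variable selects a block of configuration
# data; the CONFIG dict and its keys are invented for illustration.
import os

from cdktf import App

from infra.backend import BackendStack
from infra.frontend import FrontendStack

CONFIG = {
    "dev": {"region": "eu-west-1", "instance_count": 1},
    "dev2": {"region": "eu-west-1", "instance_count": 2},
}

app = App()

env = os.environ.get("env", "dev")  # the single controlling value
settings = CONFIG[env]              # further tuning derived from it

# settings could then be threaded into the stacks (e.g. as constructor
# arguments) instead of exporting one environment variable per knob.
FrontendStack(app, f"frontend_stack_{env}")
BackendStack(app, f"backend_stack_{env}")

app.synth()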

@suvhotta
Author

Thanks for the clarity.
Just to be clear, when we talk of configuration data here, is it some configuration based on inputs passed in a file, with the variable then reading the contents from that file? Or does it work some other way?

@ansgarm
Member

ansgarm commented Jun 23, 2023

Hi @suvhotta 👋

Has this been resolved or do you have any further questions?

@ansgarm ansgarm added waiting-on-answer and removed new Un-triaged issue labels Jun 23, 2023
@suvhotta
Author

@ansgarm No, I don't have any further queries.

@github-actions
Contributor

Hi there! 👋 We haven't heard from you in 30 days and would like to know if the problem has been resolved or if you still need help. If we don't hear from you before then, I'll auto-close this issue in 30 days.

@github-actions github-actions bot added the stale An issue or pull request that has not been updated in a very long time label Jul 24, 2023
@ansgarm
Member

ansgarm commented Jul 24, 2023

Cool! Let me close this then

@ansgarm ansgarm closed this as completed Jul 24, 2023
@github-actions
Contributor

I'm going to lock this issue because it has been closed for 30 days. This helps our maintainers find and focus on the active issues. If you've found a problem that seems similar to this, please open a new issue and complete the issue template so we can capture all the details necessary to investigate further.

@github-actions github-actions bot locked as resolved and limited conversation to collaborators Aug 24, 2023