
KMS: addAlias 1.3 ->1.4 upgrade error #3652

Closed
1 of 5 tasks
nmussy opened this issue Aug 14, 2019 · 13 comments · Fixed by #3659
Labels
needs-triage This issue or PR still needs to be triaged.

Comments


nmussy commented Aug 14, 2019

Note: for support questions, please first reference our documentation, then use Stackoverflow. This repository's issues are intended for feature requests and bug reports.

  • I'm submitting a ...

    • 🪲 bug report
    • 🚀 feature request
    • 📚 construct library gap
    • ☎️ security issue or vulnerability => Please see policy
    • ❓ support request => Please see note at the top of this template.
  • What is the current behavior?
    If the current behavior is a 🪲bug🪲: Please provide the steps to reproduce
    Trying to redeploy a KMS Key alias created pre-1.4.0 with 1.4.0 causes the following error:

 1/4 | 3:59:07 PM | CREATE_FAILED        | AWS::KMS::Alias    | TestKey/Aliasalias--test (TestKeyAliasaliastest5D993A95) alias/test already exists in stack arn:aws:cloudformation:eu-west-3:340369862100:stack/KmsReproStack/a43d42b0-be9a-11e9-af31-0e27f6d248de
        new Alias (deploy\node_modules\@aws-cdk\aws-kms\lib\alias.ts:104:22)
        \_ Key.addAlias (deploy\node_modules\@aws-cdk\aws-kms\lib\key.ts:71:12)
        \_ new KmsReproStack (deploy\lib\kms-repro.ts:17:13)
        \_ Object.<anonymous> (deploy\bin\deploy.ts:20:1)
        \_ Module._compile (internal/modules/cjs/loader.js:816:30)
        \_ Module.m._compile (deploy\node_modules\ts-node\src\index.ts:473:23)
        \_ Module._extensions..js (internal/modules/cjs/loader.js:827:10)
        \_ Object.require.extensions.(anonymous function) [as .ts] (deploy\node_modules\ts-node\src\index.ts:476:12)
        \_ Module.load (internal/modules/cjs/loader.js:685:32)
        \_ Function.Module._load (internal/modules/cjs/loader.js:620:12)
        \_ Function.Module.runMain (internal/modules/cjs/loader.js:877:12)
        \_ Object.<anonymous> (deploy\node_modules\ts-node\src\bin.ts:158:12)
        \_ Module._compile (internal/modules/cjs/loader.js:816:30)
        \_ Object.Module._extensions..js (internal/modules/cjs/loader.js:827:10)
        \_ Module.load (internal/modules/cjs/loader.js:685:32)
        \_ Function.Module._load (internal/modules/cjs/loader.js:620:12)
        \_ Function.Module.runMain (internal/modules/cjs/loader.js:877:12)
        \_ findNodeScript.then.existing (Local\nvs\node\11.15.0\x64\node_modules\npm\node_modules\libnpx\index.js:268:14)

CloudFormation attempts to create a new Alias because its logical ID changed with #3596, but with the same alias name. Since the new alias is created before the old one is removed during the update, the alias name conflicts.

  • What is the expected behavior (or behavior of feature suggested)?
    Not to crash. The easiest solution I can see is to keep track of the aliases and only use the new logical ID pattern when there is more than one (see the sketch after the reproduction steps below).

  • What is the motivation / use case for changing the behavior or adding this feature?
    Deploy previously working stacks

  • Please tell us about your environment:

    • CDK CLI Version: 1.4.0
    • Module Version: 1.4.0
    • OS: Windows 10, most likely all
    • Language: TypeScript, most likely all
  • Other information (e.g. detailed explanation, stacktraces, related issues, suggestions how to fix, links for us to have context, eg. associated pull-request, stackoverflow, gitter, etc)

The issue was originally raised by @Visorgood in #3596

Reproduction steps:

  1. Install @aws-cdk/core@^1.3.0 @aws-cdk/aws-kms@^1.3.0
  2. Deploy the following stack:
import {Key} from '@aws-cdk/aws-kms';
import {Construct, Stack, StackProps} from '@aws-cdk/core';

export class KmsReproStack extends Stack {
    public constructor(
        private readonly scope: Construct,
        private readonly id: string,
        private readonly props: StackProps
    ) {
        super(scope, id, props);

        const key = new Key(this, 'TestKey');
        key.addAlias('alias/test');
    }
}
  3. Upgrade @aws-cdk/core@^1.4.0 @aws-cdk/aws-kms@^1.4.0
  4. cdk diff:
Stack KmsReproStack
Resources
[-] AWS::KMS::Alias TestKeyAlias90FC0DD1 destroy
[+] AWS::KMS::Alias TestKey/Aliasalias--test TestKeyAliasaliastest5D993A95
  5. Try to redeploy that stack
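For context, here is a minimal sketch of the fix idea suggested above: keep the pre-1.4.0 logical ID for a key's first alias and only switch to the name-based pattern when further aliases are added. This is not necessarily what #3659 implements; the CompatKey class is hypothetical and the Alias prop names follow the current @aws-cdk/aws-kms API:

import { Alias, Key } from '@aws-cdk/aws-kms';

// Illustrative sketch only, not the actual @aws-cdk/aws-kms code.
class CompatKey extends Key {
  private aliasCount = 0;

  public addAlias(aliasName: string): Alias {
    // The first alias keeps the pre-1.4.0 construct id ('Alias') so existing
    // stacks retain their logical IDs; later aliases use the 1.4.0
    // name-based id ('/' sanitized to '--', as in "Aliasalias--test").
    const constructId = this.aliasCount === 0
      ? 'Alias'
      : 'Alias' + aliasName.replace(/\//g, '--');
    this.aliasCount++;
    return new Alias(this, constructId, { aliasName, targetKey: this });
  }
}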
nmussy added the needs-triage label Aug 14, 2019

McDoit commented Aug 14, 2019

As a workaround, have you tried to override the logical id with your old id?
If you need to have the stack working right now I mean?


nmussy commented Aug 14, 2019

I'm going to make a PR to fix the issue, but in the meantime, @McDoit is correct, and you can do the following:

  1. Run npx cdk diff, and retrieve the existing alias' logical ID, e.g.
Resources
[-] AWS::KMS::Alias TestKeyAlias90FC0DD1 destroy
[+] AWS::KMS::Alias TestKey/Aliasalias--test TestKeyAliasaliastest5D993A95
  2. Override the alias' logical ID in 1.4.0:
import {Key, CfnAlias} from "@aws-cdk/aws-kms";
// ...

const key = new Key(this, 'TestKey');
const alias = key.addAlias('alias/test');

(alias.node.defaultChild as CfnAlias).overrideLogicalId('TestKeyAlias90FC0DD1');
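With the logical ID pinned to its pre-1.4.0 value, CloudFormation no longer sees the alias as a new resource; cdk diff should show no [-]/[+] pair for it and the deployment should go through.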

Tagging @Visorgood if you're interested 👍

@Visorgood

Hello @nmussy! This is an amazing reaction from your side! Thanks a lot for looking into it and fixing it. I will use your approach as a temporary workaround, and will be waiting for the next version with the fix included. Thanks a lot once again!


eladb commented Aug 15, 2019

@nmussy thanks ❤️


nmussy commented Aug 15, 2019

No problem guys! Sorry again for the inconvenience. And thanks Elad for the quick review.


defel commented Aug 19, 2019

Just FYI: I also get this when staying on 1.3.0, after some time, so I doubt it's a regression from 1.3.0 to 1.4.0 - but the issue can definitely be reproduced with this method 👍 👍 👍 And thanks for this awesome workaround :)


nmussy commented Aug 19, 2019

That's really odd @defel. Could you send the output of your cdk diff, and a reproduction snippet if possible?


defel commented Aug 19, 2019

Sure! See the following diff. There is a little bit more happening, but the KMS alias being removed and added again with a different logical ID has happened multiple times over the last few days.

I am still on 1.3.0 (build bba9914)

Resources
[-] AWS::KMS::Key kmskeysdevkmskeydev6CA91956 orphan
[-] AWS::KMS::Alias kmskeysdevkmskeydevAlias18632944 destroy
[-] AWS::Lambda::Permission knkitapidevTboxCreateSessionHandlerApiPermissionPOSTapisession3880FA1B destroy
[-] AWS::Lambda::Permission knkitapidevTboxCreateSessionHandlerApiPermissionTestPOSTapisession3224DC34 destroy
[+] AWS::KMS::Key kms_key_dev kmskeydev272F532E
[+] AWS::KMS::Alias kms_key_dev/Aliasalias--knkit-kms-dev kmskeydevAliasaliasknkitkmsdev23C6EE23
[+] AWS::Lambda::Permission knkit-api-dev/TboxCreateSessionHandler/ApiPermission.knkitinfrapythondevknkitapidevopentokApi90CB435C.POST..api.session knkitapidevTboxCreateSessionHandlerApiPermissionknkitinfrapythondevknkitapidevopentokApi90CB435CPOSTapisession4B812AE6
[+] AWS::Lambda::Permission knkit-api-dev/TboxCreateSessionHandler/ApiPermission.Test.knkitinfrapythondevknkitapidevopentokApi90CB435C.POST..api.session knkitapidevTboxCreateSessionHandlerApiPermissionTestknkitinfrapythondevknkitapidevopentokApi90CB435CPOSTapisessionBC9B975C
[~] AWS::IAM::Policy knkit-api-dev/TboxCreateSessionHandler/ServiceRole/DefaultPolicy knkitapidevTboxCreateSessionHandlerServiceRoleDefaultPolicy362F70BF
 └─ [~] PolicyDocument
     └─ [~] .Statement:
         └─ @@ -4,7 +4,7 @@
            [ ] "Effect": "Allow",
            [ ] "Resource": {
            [ ]   "Fn::GetAtt": [
            [-]     "kmskeysdevkmskeydev6CA91956",
            [+]     "kmskeydev272F532E",
            [ ]     "Arn"
            [ ]   ]
            [ ] }

Sorry, a reproduction snippet would not be so easy, but I will try something - brb in a few minutes.


nmussy commented Aug 19, 2019

From what I can see, it's not the alias that's changing its logical ID, but your key that's changing name and being orphaned, from 'kmskeysdevkmskeydev' to 'kmskeydev' (among other things).


defel commented Aug 19, 2019

I had the KMS key and the alias in a CDK construct - could this cause the logical ID of the KMS key to change? Because nothing else changed. I will try to reproduce this, but it seems to be another issue then.


nmussy commented Aug 19, 2019

I quickly tried to look at how the IDs are generated but didn't find it, sorry. Good luck!

defel pushed a commit to defel/cdk-kms-reproduce that referenced this issue Aug 19, 2019

defel commented Aug 19, 2019

Ok, thank you. I can't get this reproduced and it's a different issue - I have to look into it a little bit more.


McDoit commented Aug 19, 2019

Moving things in and out of constructs will mess with their IDs; they are created level by level.
So if you did that, then it might be causing those issues.
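CDK logical IDs are derived from a resource's full construct path plus a hash, so wrapping a resource in an intermediate construct changes its path and therefore its logical ID. Here is a minimal sketch of that effect, using construct ids that mirror the diff above (the KmsKeys wrapper class is hypothetical):

import { App, Construct, Stack } from '@aws-cdk/core';
import { Key } from '@aws-cdk/aws-kms';

const app = new App();
const stack = new Stack(app, 'KmsReproStack');

// Key defined directly in the stack: its path is "KmsReproStack/kms_key_dev",
// so the logical ID is derived from "kmskeydev" plus a hash.
new Key(stack, 'kms_key_dev');

// The same key wrapped in an intermediate construct: its path becomes
// "KmsReproStack/kms_keys_dev/kms_key_dev", so the logical ID is derived from
// "kmskeysdevkmskeydev" plus a different hash, and CloudFormation treats it
// as a brand-new resource, orphaning or destroying the old one.
class KmsKeys extends Construct {
  constructor(scope: Construct, id: string) {
    super(scope, id);
    new Key(this, 'kms_key_dev');
  }
}
new KmsKeys(stack, 'kms_keys_dev');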
