
EKS: Access Denied when using a KubernetesManifest or HelmChart #26979

Closed
cisox opened this issue Sep 1, 2023 · 3 comments
Labels
@aws-cdk/aws-eks Related to Amazon Elastic Kubernetes Service bug This issue is a bug. needs-triage This issue or PR still needs to be triaged.

Comments


cisox commented Sep 1, 2023

Describe the bug

I've created a standard EKS cluster in CDK.

When I go to create either a HelmChart or KubernetesManifest in another stack, I run into an Access Denied error.

The manifest is one that I've been able to apply manually with kubectl without issue. Now that I'm applying it with CDK instead, I get this error. CDK appears to use nested stacks for the kubectl handler, so I'm assuming the problem lies there.

Expected Behavior

The KubernetesManifest gets applied to the EKS cluster.

Current Behavior

An error is thrown and the stack fails to deploy.

 ❌ Deployment failed: Error: The stack named BuildkitStack failed creation, it may need to be manually deleted from the AWS console: ROLLBACK_COMPLETE: Received response status [FAILED] from custom resource. Message returned: Error: b'\nAn error occurred (AccessDenied) when calling the AssumeRole operation: User: arn:aws:sts::XXXXXXXXXX:assumed-role/BuildkitStack-BuildkitSta-HandlerServiceRoleFCDC14-1HUWNNZSWMJKC/BuildkitStack-BuildkitStackCluster-Handler886CB40B-4UYDlWGIRA8J is not authorized to perform: sts:AssumeRole on resource: arn:aws:iam::XXXXXXXXXX:role/EKSStack-EKSClusterCreationRoleB865C9E8-1HYSLCW85CT8B\nUnable to connect to the server: getting credentials: exec: executable aws failed with exit code 255\n'

Reproduction Steps

Create an EKS stack:

import { InstanceType } from 'aws-cdk-lib/aws-ec2';
import { Cluster, KubernetesVersion } from 'aws-cdk-lib/aws-eks';
import { Role } from 'aws-cdk-lib/aws-iam';
import { KubectlV26Layer } from '@aws-cdk/lambda-layer-kubectl-v26';

// inside a Stack; `vpc` is assumed to be defined elsewhere
const mastersRoleArn = 'arn:...';

const mastersRole = Role.fromRoleArn(
    this,
    'MastersRole',
    mastersRoleArn,
    {
        mutable: false,
        addGrantsToResources: true
    }
);

const cluster = new Cluster(this, 'EKSCluster', {
    vpc,
    clusterName: 'eks-cluster',
    version: KubernetesVersion.of('1.26'),
    kubectlLayer: new KubectlV26Layer(this, 'kubectl'),
    defaultCapacity: 2,
    defaultCapacityInstance: new InstanceType('m5.large'),
    mastersRole
});

then try to apply any manifest:

import { KubernetesManifest } from 'aws-cdk-lib/aws-eks';

new KubernetesManifest(this, 'BuildkitManifest', {
    cluster,
    manifest: [
        {
            apiVersion: 'apps/v1',
            ...
        }
    ]
});

Possible Solution

No response

Additional Information/Context

I just upgraded to the newest version of aws-cdk but this was also happening on 2.87.0 (build 9fca790).

CDK CLI Version

2.93.0 (build 724bd01)

Framework Version

No response

Node.js Version

v18.16.1

OS

macOS Monterey 12.6

Language

Typescript

Language Version

5.0.4

Other information

No response

@cisox cisox added bug This issue is a bug. needs-triage This issue or PR still needs to be triaged. labels Sep 1, 2023
@github-actions github-actions bot added the @aws-cdk/aws-eks Related to Amazon Elastic Kubernetes Service label Sep 1, 2023

cisox commented Sep 1, 2023

Some additional info here is that I am using Fn.importValue to get values I need to do Cluster.fromClusterAttributes. I'm going to pass those into the stack in another way to see if imported values are the issue here.
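For reference, when a cluster is re-imported in another stack, the usual pattern is Cluster.fromClusterAttributes with an explicit kubectlRoleArn, so the kubectl handler assumes a role it is actually allowed to assume rather than the creation role from the original stack. A minimal sketch, assuming a hypothetical cluster name and role ARN (the real values would come from your own exports or props):

```typescript
import { Stack, StackProps } from 'aws-cdk-lib';
import { Construct } from 'constructs';
import * as eks from 'aws-cdk-lib/aws-eks';
import { KubectlV26Layer } from '@aws-cdk/lambda-layer-kubectl-v26';

export class ManifestStack extends Stack {
  constructor(scope: Construct, id: string, props?: StackProps) {
    super(scope, id, props);

    const cluster = eks.Cluster.fromClusterAttributes(this, 'ImportedCluster', {
      clusterName: 'eks-cluster',
      // Role the kubectl handler assumes to talk to the cluster.
      // Without it, the handler can end up trying to assume the
      // original cluster-creation role, producing an AccessDenied
      // like the one above. (ARN here is a placeholder.)
      kubectlRoleArn: 'arn:aws:iam::123456789012:role/EksKubectlRole',
      kubectlLayer: new KubectlV26Layer(this, 'kubectl'),
    });

    new eks.KubernetesManifest(this, 'BuildkitManifest', {
      cluster,
      manifest: [
        { apiVersion: 'v1', kind: 'Namespace', metadata: { name: 'buildkit' } },
      ],
    });
  }
}
```

The kubectl role must also be mapped in the cluster's aws-auth ConfigMap for the handler's calls to succeed.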


cisox commented Sep 1, 2023

I ended up just folding that stack into the main cluster stack and it worked fine. I'll chalk this up to imported values and nested stacks.
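An alternative to folding everything into one stack is to keep two stacks in the same CDK app and hand the live ICluster object to the consumer via props, which avoids Fn.importValue entirely. A rough sketch, with hypothetical stack and prop names:

```typescript
import { Stack, StackProps } from 'aws-cdk-lib';
import { Construct } from 'constructs';
import * as eks from 'aws-cdk-lib/aws-eks';

// Hypothetical props interface: pass the cluster construct directly
// instead of re-importing it from exported values.
interface ManifestStackProps extends StackProps {
  cluster: eks.ICluster;
}

class ManifestStack extends Stack {
  constructor(scope: Construct, id: string, props: ManifestStackProps) {
    super(scope, id, props);

    // The manifest uses the original cluster's kubectl configuration,
    // so no extra role wiring is needed.
    new eks.KubernetesManifest(this, 'BuildkitManifest', {
      cluster: props.cluster,
      manifest: [
        { apiVersion: 'v1', kind: 'Namespace', metadata: { name: 'buildkit' } },
      ],
    });
  }
}

// Wiring in the app (EksStack exposing `cluster` is assumed):
// const app = new App();
// const eksStack = new EksStack(app, 'EKSStack');
// new ManifestStack(app, 'BuildkitStack', { cluster: eksStack.cluster });
```

This only works when both stacks live in the same app; CDK then generates the cross-stack references itself.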

@cisox cisox closed this as completed Sep 1, 2023

github-actions bot commented Sep 1, 2023

⚠️COMMENT VISIBILITY WARNING⚠️

Comments on closed issues are hard for our team to see.
If you need more assistance, please either tag a team member or open a new issue that references this one.
If you wish to keep having a conversation with other community members under this issue feel free to do so.
