
[aws-eks] fargate profile issue in aws-auth config map #7981

Closed · eduardomourar opened this issue May 14, 2020 · 3 comments · Fixed by #8447

Labels: @aws-cdk/aws-eks (Related to Amazon Elastic Kubernetes Service), bug (This issue is a bug.)

Comments

eduardomourar (Contributor) commented May 14, 2020

Whenever you use awsAuth.addRoleMapping(), the aws-auth config map in the EKS cluster is replaced instead of appended to.

Reproduction Steps

  • Create a cluster (the helper declarations these snippets assume are sketched after this list):
    const cluster = new eks.Cluster(this, 'EksFargate', {
      clusterName: `${prefix}-eks`,
      mastersRole: clusterAdmin,
      vpc,
      defaultCapacity: 0,
      coreDnsComputeType: eks.CoreDnsComputeType.FARGATE,
      kubectlEnabled: true,
    });

    const fargateProfile: eks.FargateProfile = cluster.addFargateProfile('FargateProfile', {
      fargateProfileName: `${prefix}`,
      selectors: [
        { namespace: 'default' },
        { namespace: 'kube-system' },
      ],
      podExecutionRole: fargateProfileRole,
      subnetSelection: { subnets: vpc.privateSubnets },
    });
  • Run kubectl get cm aws-auth -n kube-system -o yaml and note the mapping created for the Fargate pod execution role:
  mapRoles: |
    - groups:
      - system:bootstrappers
      - system:nodes
      - system:node-proxier
      rolearn: arn:aws:iam::123456789012:role/MyFargatePodExecutionRole
      username: system:node:{{SessionName}}
  • In a later CDK deployment:
    const clusterUser = iam.Role.fromRoleArn(this, 'ClusterUser',
      `arn:aws:iam::${this.account}:role/ClusterUser`
    );
    cluster.awsAuth.addRoleMapping(clusterUser, {
      username: 'cluster-user-viewer',
      groups: [],
    });
  • Now the fargate profile execution role is no longer part of your aws-auth config map.
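
For reference, here is a minimal sketch of the supporting declarations the snippets above rely on; the names prefix, clusterAdmin, fargateProfileRole, and vpc are hypothetical stand-ins for the full app, which is linked in a comment below:

    import * as ec2 from '@aws-cdk/aws-ec2';
    import * as iam from '@aws-cdk/aws-iam';

    const prefix = 'demo';

    // VPC shared by the cluster and the Fargate profile.
    const vpc = new ec2.Vpc(this, 'Vpc');

    // Role that is granted cluster-admin (system:masters) on the cluster.
    const clusterAdmin = new iam.Role(this, 'ClusterAdmin', {
      assumedBy: new iam.AccountRootPrincipal(),
    });

    // Pod execution role assumed by Fargate to launch and run pods.
    const fargateProfileRole = new iam.Role(this, 'FargateProfileRole', {
      assumedBy: new iam.ServicePrincipal('eks-fargate-pods.amazonaws.com'),
      managedPolicies: [
        iam.ManagedPolicy.fromAwsManagedPolicyName('AmazonEKSFargatePodExecutionRolePolicy'),
      ],
    });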

Error Log

fargate-scheduler Misconfigured Fargate Profile: fargate profile <PROFILE_NAME> blocked for new launches due to: Pod execution role is not found in auth config or does not have all required permissions for launching fargate pods.

Environment

  • CLI Version : 1.38.0
  • Framework Version: 1.38.0
  • OS : MacOS
  • Language : TypeScript

Other

Workaround:

    cluster.awsAuth.addRoleMapping(fargateProfileRole, {
      username: 'system:node:{{SessionName}}',
      groups: [ 'system:bootstrappers', 'system:nodes', 'system:node-proxier' ],
    });
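
If the pod execution role was not created explicitly, a variant of the same workaround can reference it off the profile construct; this is a sketch that assumes FargateProfile exposes a podExecutionRole property:

    // Sketch: map the profile's own pod execution role back into aws-auth
    // (assumes FargateProfile.podExecutionRole is available).
    cluster.awsAuth.addRoleMapping(fargateProfile.podExecutionRole, {
      username: 'system:node:{{SessionName}}',
      groups: [ 'system:bootstrappers', 'system:nodes', 'system:node-proxier' ],
    });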

This is 🐛 Bug Report

@eduardomourar eduardomourar added bug This issue is a bug. needs-triage This issue or PR still needs to be triaged. labels May 14, 2020
@SomayaB SomayaB added the @aws-cdk/aws-eks Related to Amazon Elastic Kubernetes Service label May 18, 2020
eladb (Contributor) commented May 18, 2020

Can you please paste your entire CDK app (or at least the relevant portion)?

I am not sure I understand your workflow.

eladb (Contributor) commented May 20, 2020

Waiting for a response. Closing for now.

@eladb eladb closed this as completed May 20, 2020
eduardomourar (Contributor, Author) commented May 27, 2020

Please reopen this issue, because it is quite important. The relevant code can be found here: https://play-with-cdk.com?s=81c92ad7d69044732f052b8f9a76d3ef

@eladb eladb reopened this May 27, 2020
@SomayaB SomayaB removed the needs-triage This issue or PR still needs to be triaged. label Jun 2, 2020
eladb pushed a commit that referenced this issue Jun 9, 2020
When a Fargate Profile is added to the cluster, we need to make sure the aws-auth config map is updated from within the CDK app. EKS will do this behind the scenes if it is not done manually, but that is an out-of-band update of the config map, and it will be overridden by the CDK whenever the config map is next updated through the CDK app.

Fixes #7981

BREAKING CHANGE: `cluster.awsAuth` can now return `undefined` if the cluster is not kubectl-enabled. This ensures that users take into account the fact that the cluster may not support aws-auth updates. In most cases, it's sufficient to just use this syntax `cluster.awsAuth?.addxxx` to conditionally update aws-auth.
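
For example, a minimal usage sketch of the guarded call after this change, reusing the clusterUser role from the reproduction steps:

    // After #8447, awsAuth is undefined for clusters that are not
    // kubectl-enabled, so guard the call with optional chaining.
    cluster.awsAuth?.addRoleMapping(clusterUser, {
      username: 'cluster-user-viewer',
      groups: [],
    });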
@mergify mergify bot closed this as completed in #8447 Jun 9, 2020
mergify bot pushed a commit that referenced this issue Jun 9, 2020 (same commit message as above; Fixes #7981)
@iliapolo iliapolo changed the title EKS fargate profile issue in aws-auth config map [aws-eks] fargate profile issue in aws-auth config map Aug 16, 2020