
[aws-eks] Enable Control Plane logs in EKS cluster #4159

Closed
stefanolczak opened this issue Sep 19, 2019 · 16 comments · Fixed by #18112
Labels
@aws-cdk/aws-eks: Related to Amazon Elastic Kubernetes Service
effort/medium: Medium work item – several days of effort
feature-request: A feature should be added or improved.
p1

Comments

@stefanolczak

Use Case

Enabling control plane logging on an EKS cluster is currently only possible by calling the EKS API after the cluster is created. Doing it in CDK requires creating a Custom Resource with code that calls the API. It would be nice to have it as an argument when creating an EKS cluster from CDK.

Proposed Solution

Since the EKS cluster is created from a Python lambda when the kubectlEnabled flag is enabled, there is a simple way to create the cluster with logging enabled. The lambda code currently uses the boto3 method eks.create_cluster(), which accepts arguments for enabling logging on the created cluster (https://boto3.amazonaws.com/v1/documentation/api/latest/reference/services/eks.html#EKS.Client.create_cluster).
The lambda passes a config object as the argument to this method:

resp = eks.create_cluster(**config)

The config is passed as the properties of a custom resource and is created here:

const clusterProps: CfnClusterProps = {
  name: this.physicalName,
  roleArn: this.role.roleArn,
  version: props.version,
  resourcesVpcConfig: {
    securityGroupIds: [securityGroup.securityGroupId],
    subnetIds
  }
};
let resource;
this.kubectlEnabled = props.kubectlEnabled === undefined ? true : props.kubectlEnabled;
if (this.kubectlEnabled) {
  resource = new ClusterResource(this, 'Resource', clusterProps);
  this._defaultMastersRole = resource.creationRole;
} else {

So I suggest exposing a way to include logging properties in that config, so that they are passed through to the eks.create_cluster() method without any further changes. That should result in logging being enabled on the newly created EKS cluster.
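For illustration only, here is a minimal sketch of what the extended config could look like; the logging block is hypothetical and simply mirrors the shape accepted by boto3's eks.create_cluster(), so the lambda could forward it unchanged:

// Hypothetical sketch: a logging block added to the cluster config that the
// custom resource forwards as-is to eks.create_cluster(**config).
// The shape mirrors the EKS CreateCluster API; 'logging' is not an existing
// CfnClusterProps field and is only shown here to illustrate the proposal.
const clusterProps = {
  name: this.physicalName,
  roleArn: this.role.roleArn,
  version: props.version,
  resourcesVpcConfig: {
    securityGroupIds: [securityGroup.securityGroupId],
    subnetIds
  },
  logging: {
    clusterLogging: [
      {
        enabled: true,
        types: ['api', 'audit', 'authenticator', 'controllerManager', 'scheduler']
      }
    ]
  }
};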


This is a 🚀 Feature Request

@stefanolczak stefanolczak added feature-request A feature should be added or improved. needs-triage This issue or PR still needs to be triaged. labels Sep 19, 2019
@SomayaB SomayaB added the @aws-cdk/aws-eks Related to Amazon Elastic Kubernetes Service label Sep 19, 2019
@SomayaB SomayaB removed the needs-triage This issue or PR still needs to be triaged. label Oct 1, 2019
@SomayaB
Contributor

SomayaB commented Oct 1, 2019

Hi @stefanolczak, thanks for submitting a feature request! We will update this issue when there is any progress.

@eladb eladb added the effort/medium Medium work item – several days of effort label Jan 23, 2020
@eladb eladb added the p1 label Mar 9, 2020
@yjw113080

Any updates on this? Waiting for this feature as well.

@eladb
Contributor

eladb commented Mar 18, 2020

This is not highly prioritized at the moment, but more than happy to take contributions.

@ccfife ccfife mentioned this issue Apr 8, 2020
@SomayaB SomayaB added the in-progress This issue is being actively worked on. label Jun 11, 2020
@eladb eladb added this to the EKS Developer Preview milestone Jun 24, 2020
@eladb eladb changed the title Enable Control Plane logs in EKS cluster [EKS Feature] Enable Control Plane logs in EKS cluster Jun 24, 2020
@eladb eladb removed this from the EKS Developer Preview milestone Jun 24, 2020
@eladb eladb added this to the EKS Dev Preview milestone Jul 22, 2020
@eladb eladb assigned iliapolo and unassigned eladb Aug 4, 2020
@iliapolo iliapolo removed this from the EKS Dev Preview milestone Aug 11, 2020
@iliapolo iliapolo removed the in-progress This issue is being actively worked on. label Aug 16, 2020
@iliapolo
Contributor

Note that there is an abandoned PR for this: #8497

Consider resurrecting it once we pick this up again.

@iliapolo iliapolo changed the title [EKS Feature] Enable Control Plane logs in EKS cluster [aws-eks] Enable Control Plane logs in EKS cluster Aug 16, 2020
@iliapolo iliapolo added this to the [GA] @aws-cdk/aws-eks milestone Oct 4, 2020
@rameshmimit

Any update on this feature?

@iliapolo iliapolo removed this from the [GA] @aws-cdk/aws-eks milestone Dec 1, 2020
@iliapolo
Contributor

iliapolo commented Dec 1, 2020

@rameshmimit We are discussing this issue internally; we'll update here soon.

@micheal-hill

In the meantime, what workarounds are available? It seems to me that click-ops or the AWS CLI are the alternatives, but neither is amenable to automation - is that correct? The CLI will error if there's no change required.

@iliapolo
Contributor

Since cluster logging can be updated after cluster creation, you can create a custom resource that updates that config.
The AwsCustomResource construct should be a good fit for this.

@daisuke-yoshimoto
Contributor

Any update on this?

@jtomaszewski

jtomaszewski commented Jun 7, 2021

If anybody wants the workaround, here's the code we wrote in our CDK app to enable logging for the cluster using AwsCustomResource.

import { FargateCluster } from "@aws-cdk/aws-eks";
import { Stack } from "@aws-cdk/core";
import {
  AwsCustomResource,
  AwsCustomResourcePolicy,
} from "@aws-cdk/custom-resources";

// Enables logs for the cluster.
//
// Taken from
// https://github.com/aws/aws-cdk/issues/4159#issuecomment-855625700
export function setupClusterLogging(
  stack: Stack,
  cluster: FargateCluster
): void {
  new AwsCustomResource(stack, "ClusterLogsEnabler", {
    policy: AwsCustomResourcePolicy.fromSdkCalls({
      resources: [`${cluster.clusterArn}/update-config`],
    }),
    onCreate: {
      physicalResourceId: { id: `${cluster.clusterArn}/LogsEnabler` },
      service: "EKS",
      action: "updateClusterConfig",
      region: stack.region,
      parameters: {
        name: cluster.clusterName,
        logging: {
          clusterLogging: [
            {
              enabled: true,
              types: [
                "api",
                "audit",
                "authenticator",
                "controllerManager",
                "scheduler",
              ],
            },
          ],
        },
      },
    },
    onDelete: {
      physicalResourceId: { id: `${cluster.clusterArn}/LogsEnabler` },
      service: "EKS",
      action: "updateClusterConfig",
      region: stack.region,
      parameters: {
        name: cluster.clusterName,
        logging: {
          clusterLogging: [
            {
              enabled: false,
              types: [
                "api",
                "audit",
                "authenticator",
                "controllerManager",
                "scheduler",
              ],
            },
          ],
        },
      },
    },
  });
}
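
For context, a hypothetical call site for the helper above (stack name, cluster name, and Kubernetes version are illustrative):

import { App, Stack } from "@aws-cdk/core";
import { FargateCluster, KubernetesVersion } from "@aws-cdk/aws-eks";

// Illustrative usage of setupClusterLogging(); names and version are assumptions.
const app = new App();
const stack = new Stack(app, "EksLoggingStack");
const cluster = new FargateCluster(stack, "Cluster", {
  version: KubernetesVersion.V1_20,
});
setupClusterLogging(stack, cluster);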

@iliapolo iliapolo removed their assignment Jun 27, 2021
@jasonumiker

So I did the above with an AwsCustomResource and it worked great until we wanted to create a FargateProfile. Even adding an explicit dependency from the FargateProfile on the logging custom resource didn't help - it fails with a "you can't update the logs while we are creating a Fargate Profile" error. Having fought with it all day, I am now having to look at going lower level to a Lambda-backed custom resource to get access to an is_complete_handler or something.

@micheal-hill

@jasonumiker in our case we're using Fargate, but the logging config is set after Fargate has been set up. We're also using eks.FargateCluster, in case that's relevant.

@jasonumiker

Thanks - yeah I eventually tried flipping the dependency and that seems to work 🤞
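
For anyone hitting the same ordering conflict, a minimal sketch of what "flipping the dependency" can look like, assuming the setupClusterLogging helper above is changed (hypothetically) to return the AwsCustomResource it creates:

// Sketch (names illustrative): make the logging custom resource depend on the
// Fargate profile, so eks:UpdateClusterConfig is only called after the profile
// has finished creating, rather than the other way around.
const profile = cluster.addFargateProfile("Default", {
  selectors: [{ namespace: "default" }],
});
const logsEnabler = setupClusterLogging(stack, cluster); // assumed to return the AwsCustomResource
logsEnabler.node.addDependency(profile);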

@SpringMT

CloudFormation (NOT CDK) now supports EKS control plane logging settings:
https://aws.amazon.com/about-aws/whats-new/2021/11/amazon-eks-cluster-configuration-aws-cloudformation/
How about CDK?

@bweigel
Contributor

bweigel commented Nov 25, 2021

CloudFormation (NOT CDK) supports EKS control plane logging settings.

I thought this would be easy...just use an escape hatch, like this:

    const cfnCluster = cluster.node.defaultChild as eks.CfnCluster;
    cfnCluster.logging = {
      clusterLogging: {
        enabledTypes: [{ type: 'api' }, { type: 'audit' }, { type: 'authenticator' }, { type: 'controllerManager' }, { type: 'scheduler' }]
      }
    }

But alas, it seems like the CDK does not use the low-level CfnCluster (aka AWS::EKS::Cluster) to provision a cluster, but depends on a custom resource:

const resource = new CustomResource(this, 'Resource', {
  resourceType: CLUSTER_RESOURCE_TYPE,
  serviceToken: provider.serviceToken,
  properties: {
    // the structure of config needs to be that of 'aws.EKS.CreateClusterRequest' since its passed as is
    // to the eks.createCluster sdk invocation.
    Config: {
      name: props.name,
      version: props.version,
      roleArn: props.roleArn,
      encryptionConfig: props.encryptionConfig,
      kubernetesNetworkConfig: props.kubernetesNetworkConfig,
      resourcesVpcConfig: {
        subnetIds: (props.resourcesVpcConfig as CfnCluster.ResourcesVpcConfigProperty).subnetIds,
        securityGroupIds: (props.resourcesVpcConfig as CfnCluster.ResourcesVpcConfigProperty).securityGroupIds,
        endpointPublicAccess: props.endpointPublicAccess,
        endpointPrivateAccess: props.endpointPrivateAccess,
        publicAccessCidrs: props.publicAccessCidrs,
      },
    },
    AssumeRoleArn: this.adminRole.roleArn,
    // IMPORTANT: increment this number when you add new attributes to the
    // resource. Otherwise, CloudFormation will error with "Vendor response
    // doesn't contain XXX key in object" (see #8276) by incrementing this
    // number, you will effectively cause a "no-op update" to the cluster
    // which will return the new set of attribute.
    AttributesRevision: 2,
  },
});

Long story short, the quick "escape hatch" hack does not work 😭

choryuidentify added a commit to choryuidentify/aws-cdk that referenced this issue Dec 21, 2021
choryuidentify added a commit to choryuidentify/aws-cdk that referenced this issue Dec 21, 2021
choryuidentify added a commit to choryuidentify/aws-cdk that referenced this issue Dec 21, 2021
choryuidentify added a commit to choryuidentify/aws-cdk that referenced this issue Dec 21, 2021
choryuidentify added a commit to choryuidentify/aws-cdk that referenced this issue Jan 10, 2022
@mergify mergify bot closed this as completed in #18112 Jan 28, 2022
mergify bot pushed a commit that referenced this issue Jan 28, 2022
Fixes #4159

----

*By submitting this pull request, I confirm that my contribution is made under the terms of the Apache-2.0 license*
@github-actions

⚠️COMMENT VISIBILITY WARNING⚠️

Comments on closed issues are hard for our team to see.
If you need more assistance, please either tag a team member or open a new issue that references this one.
If you wish to keep having a conversation with other community members under this issue, feel free to do so.

TikiTDO pushed a commit to TikiTDO/aws-cdk that referenced this issue Feb 21, 2022
Fixes aws#4159

----

*By submitting this pull request, I confirm that my contribution is made under the terms of the Apache-2.0 license*
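
For anyone finding this issue later: with the change from #18112, control plane logs can be requested directly on the construct. A minimal sketch, assuming a CDK version that includes that release (see the aws-eks module README for the authoritative API):

import * as eks from "@aws-cdk/aws-eks";

// Sketch, assuming the clusterLogging property introduced by #18112 is
// available in the installed CDK version. Runs inside a Stack.
const cluster = new eks.Cluster(this, "Cluster", {
  version: eks.KubernetesVersion.V1_21,
  clusterLogging: [
    eks.ClusterLoggingTypes.API,
    eks.ClusterLoggingTypes.AUDIT,
    eks.ClusterLoggingTypes.AUTHENTICATOR,
    eks.ClusterLoggingTypes.CONTROLLER_MANAGER,
    eks.ClusterLoggingTypes.SCHEDULER,
  ],
});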