[aws-eks] Enable Control Plane logs in EKS cluster #4159
Description
Use Case
Enabling Control Plane logging in an EKS cluster is currently only possible by calling the EKS API after the cluster is created. Doing this in CDK requires creating a Custom Resource with code that calls the API. It would be nice to have it as an argument when creating an EKS cluster from CDK.
Proposed Solution
Since the EKS cluster is created from a Python lambda when the kubectlEnabled flag is enabled, there is a simple way to create the cluster with logging enabled. The lambda code currently uses the boto3 method eks.create_cluster(), which accepts arguments that enable logging on the created cluster (https://boto3.amazonaws.com/v1/documentation/api/latest/reference/services/eks.html#EKS.Client.create_cluster).
The lambda passes config as the argument to this method:

```python
resp = eks.create_cluster(**config)
```
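For context, the logging argument that boto3's eks.create_cluster() accepts has the shape below (per the boto3 EKS API reference). This is only a sketch of how such a key could sit inside the lambda's config dict; the cluster name and role ARN are hypothetical placeholder values, and no AWS call is made here:

```python
# The `logging` argument shape accepted by eks.create_cluster(),
# as documented in the boto3 EKS API reference.
logging_config = {
    "clusterLogging": [
        {
            "types": [
                "api",
                "audit",
                "authenticator",
                "controllerManager",
                "scheduler",
            ],
            "enabled": True,
        }
    ]
}

# If this key were merged into the lambda's `config`, the existing
# `eks.create_cluster(**config)` call would forward it unchanged.
config = {
    "name": "my-cluster",  # hypothetical value for illustration
    "roleArn": "arn:aws:iam::123456789012:role/eks-role",  # hypothetical
    "logging": logging_config,
}
```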
The config is passed as the properties of the custom resource and is created here:

aws-cdk/packages/@aws-cdk/aws-eks/lib/cluster.ts
Lines 364 to 379 in c3b3c93

```ts
const clusterProps: CfnClusterProps = {
  name: this.physicalName,
  roleArn: this.role.roleArn,
  version: props.version,
  resourcesVpcConfig: {
    securityGroupIds: [securityGroup.securityGroupId],
    subnetIds
  }
};

let resource;
this.kubectlEnabled = props.kubectlEnabled === undefined ? true : props.kubectlEnabled;
if (this.kubectlEnabled) {
  resource = new ClusterResource(this, 'Resource', clusterProps);
  this._defaultMastersRole = resource.creationRole;
} else {
```
So I suggest exposing a way to include the logging properties in this config, so that they are passed through to the eks.create_cluster() method without any further changes. That should result in logging being enabled on the newly created EKS cluster.
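To sketch the pass-through idea (all names here are hypothetical; the actual CDK property and lambda handler would need to be designed): if the custom resource properties carried an optional logging block, the lambda could forward it to boto3 untouched. A minimal stub stands in for the boto3 EKS client so the flow can be shown without AWS access:

```python
def create_cluster(eks, props):
    """Hypothetical lambda fragment: forwards the custom resource
    properties, including an optional `logging` key, straight to boto3."""
    config = dict(props)  # copy of the custom resource properties
    # No transformation needed: eks.create_cluster() accepts `logging` as-is.
    return eks.create_cluster(**config)


class FakeEks:
    """Stub standing in for the boto3 EKS client, echoing its arguments."""

    def create_cluster(self, **kwargs):
        return {"cluster": kwargs}


props = {
    "name": "demo",  # hypothetical cluster name
    "logging": {
        "clusterLogging": [{"types": ["api", "audit"], "enabled": True}]
    },
}
resp = create_cluster(FakeEks(), props)
```

Because the existing lambda already expands config with `**`, adding the key to the properties is the only change needed on the Python side.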
This is a 🚀 Feature Request