(aws-eks): Cannot remove logging configuration after setting them #19898
Closed
Labels
@aws-cdk/aws-eks (Related to Amazon Elastic Kubernetes Service), bug (This issue is a bug.), effort/medium (Medium work item – several days of effort), p1
Description
Describe the bug
Updating a cluster to remove its control plane logging configuration fails. The subsequent rollback also fails, with a slightly different error.
Expected Behavior
The cluster update should succeed, or at least the rollback should complete.
Current Behavior
The update fails with the error "The type for cluster update was not provided." The subsequent rollback then fails with a different error: "No changes needed for the logging config provided."
Reproduction Steps
- Create and deploy an empty cluster:

```ts
import * as path from "path";
import * as cdk from "aws-cdk-lib";
import { Construct } from "constructs";
import { aws_ec2 as ec2, aws_eks as eks, aws_iam as iam, aws_lambda as lambda } from "aws-cdk-lib";

export interface EksClusterStackProps extends cdk.StackProps {
  vpcId: string;
}

export class EksClusterStack extends cdk.Stack {
  public readonly cluster: eks.Cluster;

  constructor(scope: Construct, id: string, props: EksClusterStackProps) {
    super(scope, id, props);

    // Cluster /////////////////////////////////////////////////////////////////
    const clusterAdminRole = new iam.Role(this, "ClusterAdminRole", {
      assumedBy: new iam.AccountRootPrincipal(),
    });
    const vpc = ec2.Vpc.fromLookup(this, "MainVpc", { vpcId: props.vpcId });
    this.cluster = new eks.Cluster(this, "EksCluster", {
      vpc: vpc,
      vpcSubnets: [{ subnetType: ec2.SubnetType.PRIVATE_WITH_NAT }],
      clusterName: `${id}`,
      mastersRole: clusterAdminRole,
      version: eks.KubernetesVersion.V1_22,
      kubectlLayer: new lambda.LayerVersion(this, "KubectlLayer", {
        code: lambda.Code.fromAsset(path.join(__dirname, "layers", "kubectl.zip")),
      }),
    });
  }
}
```

- Change the cluster stack to add a logging configuration and deploy it:
```ts
clusterLogging: [
  eks.ClusterLoggingTypes.API,
  eks.ClusterLoggingTypes.AUDIT,
  eks.ClusterLoggingTypes.AUTHENTICATOR,
  eks.ClusterLoggingTypes.CONTROLLER_MANAGER,
  eks.ClusterLoggingTypes.SCHEDULER,
],
```

- Now remove the entry above again to restore the original setup (or remove only some of the types and keep the rest), and deploy. The `Custom::AWSCDK-EKS-Cluster` resource fails to update with the following error:
```
Received response status [FAILED] from custom resource. Message returned: No changes needed for the logging config provided
Logs: /aws/lambda/InfraMainCluster-awscdkawse-OnEventHandler
    at Object.extractError (/var/runtime/node_modules/aws-sdk/lib/protocol/json.js:52:27)
    at Request.extractError (/var/runtime/node_modules/aws-sdk/lib/protocol/rest_json.js:49:8)
    at Request.callListeners (/var/runtime/node_modules/aws-sdk/lib/sequential_executor.js:106:20)
    at Request.emit (/var/runtime/node_modules/aws-sdk/lib/sequential_executor.js:78:10)
    at Request.emit (/var/runtime/node_modules/aws-sdk/lib/request.js:686:14)
    at Request.transition (/var/runtime/node_modules/aws-sdk/lib/request.js:22:10)
    at AcceptorStateMachine.runTo (/var/runtime/node_modules/aws-sdk/lib/state_machine.js:14:12)
    at /var/runtime/node_modules/aws-sdk/lib/state_machine.js:26:10
    at Request.<anonymous> (/var/runtime/node_modules/aws-sdk/lib/request.js:38:9)
    at Request.<anonymous> (/var/runtime/node_modules/aws-sdk/lib/request.js:688:12)
```
- Try to complete the rollback; it fails with a slightly different error.
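The "No changes needed for the logging config provided" message suggests the custom resource handler forwards only the desired logging types to `UpdateClusterConfig`, whereas the EKS API only turns a log type off when it is sent with `enabled: false`. A minimal sketch of how the update payload could be diffed instead (hypothetical helper, not the actual handler code):

```typescript
// Hypothetical sketch: build the clusterLogging payload for an EKS
// UpdateClusterConfig call from the previous and desired log types.
// Types that were enabled before but are now absent must be sent
// explicitly with enabled: false, or EKS reports "No changes needed".
type LogType = "api" | "audit" | "authenticator" | "controllerManager" | "scheduler";

interface LogSetup {
  types: LogType[];
  enabled: boolean;
}

function buildLoggingUpdate(oldTypes: LogType[], newTypes: LogType[]): LogSetup[] {
  const clusterLogging: LogSetup[] = [];
  if (newTypes.length > 0) {
    // Types that should (still) be enabled.
    clusterLogging.push({ types: newTypes, enabled: true });
  }
  // Previously enabled types that were removed must be disabled explicitly.
  const removed = oldTypes.filter((t) => !newTypes.includes(t));
  if (removed.length > 0) {
    clusterLogging.push({ types: removed, enabled: false });
  }
  return clusterLogging;
}
```

For the failing repro step, `buildLoggingUpdate(["api", "audit"], [])` would produce a single `enabled: false` entry for `api` and `audit` instead of an empty request.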
Possible Solution
🤷♂️
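One possible out-of-band workaround, sketched under the assumption that the cluster name is `InfraMainCluster` (taken from the log group name above), is to disable the log types manually with an explicit `enabled: false` request before removing `clusterLogging` from the stack. The request shape mirrors the EKS `UpdateClusterConfig` API:

```typescript
// Sketch only: a manual UpdateClusterConfig request that explicitly
// disables the control plane log types. The cluster name is a guess
// taken from the logs; adjust it for your deployment.
const params = {
  name: "InfraMainCluster",
  logging: {
    clusterLogging: [
      {
        types: ["api", "audit", "authenticator", "controllerManager", "scheduler"],
        enabled: false, // disabling must be explicit; omitting types is a no-op
      },
    ],
  },
};
// e.g. with the AWS SDK v2 used by the handler:
// new AWS.EKS().updateClusterConfig(params).promise();
```

After the types are disabled on the cluster itself, deploying the stack without the `clusterLogging` property should no longer need a logging update.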
Additional Information/Context
No response
CDK CLI Version
2.20.0 (build 738ef49)
Framework Version
2.20.0
Node.js Version
v16.13.0
OS
Darwin Version 21.4.0
Language
Typescript
Language Version
Version 3.9.10
Other information
No response