Description
Containers started through AWS Batch can't perform even the most basic communication with other AWS services, because the job definition provides no meaningful defaults.
Use Case
I've created an AWS Batch compute environment, queue, and job definition. The job definition is very trivial - a NodeJS app that writes a row to DynamoDB. However, this trivial example fails with an error: UnhandledPromiseRejectionWarning: ConfigError: Missing region in config. That's because AWS Batch does not pass information about the current AWS region to the container the way the Lambda runtime does, so the NodeJS app, and the aws-sdk inside it, fails to connect to the DynamoDB service.
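A minimal sketch of why the call fails: the SDK resolves its region from explicit client configuration or from the AWS_REGION environment variable, and a Batch container provides neither. The function below is a simplified illustration of that resolution order, not the actual aws-sdk code:

```typescript
// Simplified illustration of the SDK's region resolution:
// an explicit config value wins, then the AWS_REGION environment
// variable; if both are absent, client construction fails.
function resolveRegion(explicit?: string): string {
  const region = explicit ?? process.env.AWS_REGION;
  if (!region) {
    // Mirrors the "ConfigError: Missing region in config" failure.
    throw new Error('ConfigError: Missing region in config');
  }
  return region;
}
```

Inside a Batch container AWS_REGION is unset, so the lookup throws; in a Lambda runtime the variable is injected automatically, which is why the same app works there.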
Proposed Solution
I usually solve this by passing AWS_REGION as an environment variable in the container definition:

```ts
new JobDefinition(this, 'app1', {
  container: {
    image: ContainerImage.fromAsset('./../apps/app1'),
    environment: {
      AWS_REGION: Stack.of(this).region,
    },
  },
});
```

This seems like something that could be embedded inside the JobDefinition construct to make life easier for developers who use it.
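One way the construct could embed this is to merge a stack-derived region default into any user-supplied environment map, letting an explicit value win. A sketch of that merge logic (the helper name is illustrative, not the actual construct internals):

```typescript
// Hypothetical helper: merge a default AWS_REGION into the
// container's environment map. A user-provided AWS_REGION takes
// precedence over the stack-derived default.
function withDefaultRegion(
  stackRegion: string,
  environment: Record<string, string> = {},
): Record<string, string> {
  // The user's map is spread last, so its keys override the default.
  return { AWS_REGION: stackRegion, ...environment };
}
```

Because an explicit AWS_REGION override is preserved, the change would be backward compatible for anyone already setting the variable themselves.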
- 👋 I may be able to implement this feature request
- ⚠️ This feature might incur a breaking change

This is a 🚀 Feature Request