This project serves as a blueprint to get your python code running in AWS Lambda, deployed by the serverless framework, and monitored and managed by CloudReactor. See a summary of the benefits of these technologies. This project is designed with best practices and smart defaults in mind, but is also meant to be customizable.
It has these features built-in:
- Sets up Tasks to be monitored and managed by CloudReactor
- Reads secrets from AWS Secrets Manager
- Uses pip-tools to manage only top-level python library dependencies
- Uses pytest (automated tests), pylint (static code analysis), mypy (static type checking), and pip-audit (security vulnerability checking) for quality control
First, setup AWS and CloudReactor by following the pre-requisites. You'll be granting CloudReactor permission to start Tasks on your behalf, creating API keys, and optionally creating an IAM user/role that has permission to deploy your Tasks.
Next, you'll need to get this project's source code onto a filesystem where you can make changes. First fork the project, then clone your fork:
git clone https://github.com/YourOrg/cloudreactor-python-lambda-quickstart.git
These tools are necessary for local development and deployment:
- pyenv and pyenv-virtualenv are used for local python development
- pip-tools is used to compile top-level python library dependencies into requirements.txt files for python
- nvm is used to provide the NodeJS runtime used by serverless for deployment
After installing the required tools above, create the virtual environment:
pyenv virtualenv 3.9.13 cloudreactor-python-lambda-quickstart-dev
pyenv activate cloudreactor-python-lambda-quickstart-dev
pip install -r requirements.txt -r dev-requirements.txt
To run locally, first copy config.localdev.json.sample to
config.localdev.json and update the API key value to one created in
CloudReactor that has the Developer access level, either unscoped, or scoped
to the Run Environment you want your Task to appear in.
Before running, ensure you are using the correct virtualenv:
pyenv activate cloudreactor-python-lambda-quickstart-dev
Finally, to run:
python -m functions.handler
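For orientation, a handler module that can be invoked both by Lambda and locally via `python -m` generally follows the shape below. This is only a sketch: the function name `handle`, the event contents, and the response shape are illustrative assumptions, not the project's actual code.

```python
# functions/handler.py -- a minimal sketch only; the real module's function
# name and logic may differ.
import json
from typing import Any, Optional


def handle(event: dict, context: Optional[Any] = None) -> dict:
    """Entry point invoked by AWS Lambda, and by the local runner below."""
    print(f"Received event: {json.dumps(event)}")
    return {
        "statusCode": 200,
        "body": json.dumps({"message": "Hello from Lambda"}),
    }


if __name__ == "__main__":
    # `python -m functions.handler` lands here, simulating an invocation
    # with an empty event and no Lambda context.
    print(handle({}))
```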
To run type-checking with mypy:
mypy -p functions
To run source-code static analysis:
pylint functions
To check for security vulnerabilities in the python libraries:
python -m pip_audit -r requirements.txt
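The project also lists pytest for automated tests (see the features above). A minimal test might look like the following; the import of `handle` matches the illustrative handler sketch earlier, so adjust it to the project's real entry point.

```python
# tests/test_handler.py -- illustrative only; adapt the import and assertions
# to the project's actual handler.
from functions.handler import handle  # assumed entry-point name


def test_handle_returns_success():
    # Lambda handlers take an event dict and a context object; the context is
    # unused here, so None stands in for it.
    result = handle({"source": "unit-test"}, None)
    assert result["statusCode"] == 200
```

Run the tests with pytest from the project root.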
If you don't already have NodeJS 14.17.5 installed in nvm:
nvm install 14.17.5
Then, to deploy:
./deploy.sh <Run Environment name>
This project is set up to deploy with a GitHub Action when a commit is pushed to the master branch. It requires these GitHub secrets to be set:
- AWS_ACCESS_KEY_ID
- AWS_SECRET_ACCESS_KEY
- AWS_REGION
- SERVERLESS_CONFIG_YAML - set to YAML text; use a modified version of deploy_config/serverless-config-sample.yml, with `profile` set to `null`
See serverless's guide to setting up access keys to be used for deployment.
This project uses serverless-python-requirements to include python libraries.
To build the requirements.txt file, the project uses
pip-tools so that we only have to
manage top-level python library dependencies in requirements.in. To update the
compiled requirements.txt file:
pip-compile --allow-unsafe --generate-hashes --output-file=requirements.txt requirements.in
The development environment has additional dependencies found in
dev-requirements.in. To update the compiled dev-requirements.txt file:
pip-compile --allow-unsafe --generate-hashes --output-file=dev-requirements.txt dev-requirements.in
Then to install the libraries:
pip install -r requirements.txt -r dev-requirements.txt
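For reference, requirements.in holds only direct dependencies, one requirement per line; pip-compile expands them into the fully pinned, hashed requirements.txt. The package names below are illustrative examples, not the project's actual dependency list:

```text
# requirements.in -- illustrative contents only; list direct dependencies here
requests>=2.28
boto3
```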
This project is set up to read secrets from AWS Secrets Manager, using proc_wrapper (a sketch of reading a resolved secret appears at the end of this section). It is possible to populate Secrets Manager using Terraform, with the instructions below:
Requirements:
- Terraform (tested with version 1.2.5)
First, ensure the S3 bucket example-projects-terraform-remote-state exists
and can be accessed by the account used to run Terraform.
In the terraform directory:
export AWS_PROFILE=<your AWS profile name>
export AWS_REGION=<AWS region in which your function will execute>
terraform init
terraform workspace new <stage>
terraform plan -out plan.out
terraform apply plan.out
Alternatively, you can populate the secrets manually in
the AWS Console. The name of the secret should be staging/cloudreactor-python-lambda-quickstart/secrets.json where
staging should be replaced with the stage name.
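Once the secret exists, proc_wrapper can resolve it at runtime and expose its values to the handler, typically through environment variables. The sketch below assumes proc_wrapper has been configured to place a value from secrets.json into an environment variable named DB_PASSWORD; both the variable name and that configuration are illustrative, not the project's actual setup.

```python
import os

# Illustrative only: assumes proc_wrapper has resolved a value from
# secrets.json into the DB_PASSWORD environment variable before this runs.
db_password = os.environ.get("DB_PASSWORD")

if db_password is None:
    raise RuntimeError(
        "DB_PASSWORD was not provided; check the proc_wrapper secret configuration"
    )
```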