Contributing to OpEn
How can I contribute to OpEn?
Thank you for considering contributing to Optimization Engine (OpEn)!
OpEn is an open source project and welcomes contributions from the community.
You can contribute in several ways:
- Submit an issue:
Often, bugs go unnoticed by the core development team, and certain
use cases and user needs escape our attention.
Consider submitting an issue if:
- You would like to report a bug; please use the provided template for reporting bugs. It is essential to give information about your system (OS, OpEn version) and to outline a sequence of steps that reproduces the error. When possible, please provide a minimal working example
- You would like to request a new feature; please use the provided template
- You would like to propose modifications to OpEn's documentation, such as asking for a concept to be explained more clearly or requesting an additional example
- Share with us a success story on Discord
- Create a pull request (see below)
or, show us your love:
- Give us a star on GitHub
- Spread the word on Twitter

I just have a question!
The easiest and quickest way to ask a question is to reach us on Discord or Gitter.
You may also consult the frequently asked questions.
Submitting issues
You may submit an issue regarding anything related to OpEn, such as:
- a bug
- insufficient/vague documentation
- request for a feature
- request for an example
You should, however, first make sure that the same - or a very similar - issue is not already open; if it is, you may add a comment to the existing issue instead.
Contributing code or docs
In order to contribute code or documentation, you need to fork our GitHub repository, make your modifications and submit a pull request. You should follow these rules:
- create one or more issues on GitHub that will be associated with your changes
- start from master: fork OpEn and create a branch off master
git checkout -b fix/xyz master
- read the style guide below (and write unit/integration tests)
- create a pull request in which you need to explain the key changes
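The branching step above can be sketched end to end. The scratch repository below stands in for your clone of your fork, so the commands can be tried anywhere; the branch name fix/xyz is just an example:

```shell
# Sketch of the branching step, on a throwaway repository; replace the
# scratch repo with your actual clone of your fork of OpEn.
tmp=$(mktemp -d)
cd "$tmp"
git init -q .                          # stand-in for the cloned fork
git config user.email "you@example.com"
git config user.name "Your Name"
git commit -q --allow-empty -m "Initial commit"
git branch -M master                   # ensure the base branch is called master
git checkout -q -b fix/xyz master      # topic branch based on master
git branch --show-current              # prints: fix/xyz
```

Keeping each fix on its own topic branch makes the eventual pull request easy to review.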
Coding style guide
Things to keep in mind:
- Code: intuitive structure and variable names, short atomic functions
- Comments: help others better understand your code
- Docs: document all functions (even private ones)
- Tests: write comprehensive, exhaustive tests
Rust
General guidelines: Read the Rust API guidelines and this API checklist
Naming convention: We follow the standard naming convention of Rust.
Documentation: We follow these guidelines. Everything should be documented.
Python
We follow this style guide and its naming convention
Website
This documentation is generated with Docusaurus - read a detailed guide here.
- All docs are in docs/content/
- Blog entries are in docs/website/blog/
To start the website locally (at http://localhost:3000/optimization-engine), change directory to docs/website and run yarn start. To update the website, execute ./publish.sh from the same directory (you need to be a collaborator on GitHub).
Using Git
When using Git, keep in mind the following guidelines:
- Create simple, atomic commits
- Write comprehensive commit messages
- Work on a forked repository
- When you're done, submit a pull request to alphaville/optimization-engine; it will be promptly assigned to a reviewer and we will contact you as soon as possible.
Branch master is protected and all pull requests need to be reviewed by a person
other than their proposer before they can be merged into master.
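As a sketch of what "simple, atomic commits" means in practice, here is a scratch-repository example; the file name and commit messages are made up for illustration:

```shell
# Each commit holds one logical change and a message describing it.
tmp=$(mktemp -d)
cd "$tmp"
git init -q .
git config user.email "you@example.com"
git config user.name "Your Name"
printf 'fn solve() {}\n' > solver.rs
git add solver.rs
git commit -q -m "Add skeleton of the solver module"
printf '/// Documentation for solve\n' >> solver.rs
git add solver.rs
git commit -q -m "Document the solver function"
git rev-list --count HEAD              # prints: 2
```

Two small commits like these are easier to review, and to revert if needed, than one commit that mixes unrelated changes.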
Versioning
This project consists of independent modules:
(i) the core Rust library,
(ii) the MATLAB interface,
(iii) the Python interface.
Each module has a different version number (X.Y.Z).
We use the SemVer standard - we quote from semver.org:
Given a version number MAJOR.MINOR.PATCH, increment the:
MAJOR version when you make incompatible API changes, MINOR version when you add functionality in a backwards-compatible manner, and PATCH version when you make backwards-compatible bug fixes.
Additional labels for pre-release and build metadata are available as extensions to the MAJOR.MINOR.PATCH format.
We also keep a log of changes where we summarize the main changes since the last version.
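The MAJOR/MINOR/PATCH rules can be illustrated with a small helper; bump() below is a hypothetical function written for this example, not part of OpEn's tooling:

```shell
# SemVer illustration; bump() is a hypothetical helper, not OpEn tooling.
bump() {  # usage: bump <major|minor|patch> <X.Y.Z>
  IFS=. read -r major minor patch <<EOF
$2
EOF
  case "$1" in
    major) echo "$((major + 1)).0.0" ;;                 # incompatible API change
    minor) echo "${major}.$((minor + 1)).0" ;;          # backwards-compatible feature
    patch) echo "${major}.${minor}.$((patch + 1))" ;;   # backwards-compatible bug fix
  esac
}
bump major 0.10.4   # prints: 1.0.0
bump minor 0.10.4   # prints: 0.11.0
bump patch 0.10.4   # prints: 0.10.5
```

Note that bumping MAJOR or MINOR resets the lower-order components to zero.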
Releasing
Each time the major or minor number of the Rust library is updated, a new crate should be published on crates.io.
In order to release a new version, go through the following components, in this order:
- open (the Rust crate)
- opengen (the Python package)
- the docker image
Checklist:
- Update the CHANGELOG: bump the version and write a summary of changes
- Update Cargo.toml: bump the version
- Resolve all associated issues on GitHub
- Write new unit tests if necessary
- Update the API documentation
- Update the information on the website
- Merge into master once your pull request has been approved
Then, create a tag and push it...
git tag -a v0.10.0 -m "v0.10.0"
git push --tags
Next, update opengen. This will have to be a new PR.
Checklist:
- Update the CHANGELOG: bump the version and write a summary of changes
- Update VERSION: bump the version
- Review pyproject.toml
- Resolve all associated issues on GitHub
- Write new unit tests if necessary
- Update the API documentation
- Update the information on the website
- Merge into master once your pull request has been approved
Then, create a tag and push it...
git tag -a opengen-0.10.0 -m "opengen-0.10.0"
git push --tags
Lastly, update the docker image. This will have to be a new PR.
Update the Dockerfile. You may need to bump the versions of open and opengen:
ARG OPENGEN_VERSION=0.10.0
ARG OPTIMIZATION_ENGINE_CRATE_VERSION=0.11.0
Update the CHANGELOG and the README file. Build and test the image, then push it with
docker push alphaville/open:0.7.0
Update the website docs and the promo on the main page
To update the website, run
GIT_USER=alphaville \
CURRENT_BRANCH=master \
USE_SSH=true \
yarn deploy
from within docs/website/. Then, update the opengen API docs too;
just push a commit with message starting with [docit].
You can also issue an empty commit, without staging any changes. Run
git commit -m '[docit] update api docs' --allow-empty
Running tests locally
If you are working on the Python interface (opengen) or the website/docs,
it is best to use a dedicated Python virtual environment.
Set up a virtual environment
From within python/, create and activate a virtual environment:
cd python
python3 -m venv venv
source venv/bin/activate
python -m pip install --upgrade pip
pip install -e '.[dev]'
If you plan to run the benchmark suite as well, install the extra dependency:
pip install "pytest-benchmark[histogram]"
Run the Rust tests
From the repository root, run:
cargo test
This will run all unit tests, including the documentation tests (the examples in doc comments). To run only the library unit tests, do:
cargo test --lib
If you want a faster compile-only check, you can also run:
cargo check
Run the Python and code-generation tests
From within python/, run the following tests after you activate venv.
The package's optional development dependencies include pytest, so the
recommended install command is:
pip install -e '.[dev]'
You can keep using the existing unittest commands:
# Activate venv first
python -W ignore test/test_constraints.py -v
python -W ignore test/test.py -v
python -W ignore test/test_ocp.py -v
or execute the same files with pytest:
pytest test/test_constraints.py
pytest test/test.py
pytest test/test_ocp.py
The ROS2 tests should normally be run from an environment where ROS2 is already
installed and configured, for example a dedicated micromamba environment.
They should not be assumed to run from the plain venv above unless that
environment also contains a working ROS2 installation together with ros2 and
colcon.
For example:
cd python
micromamba activate ros_env
pip install .
python -W ignore test/test_ros2.py -v
If ROS2 is not installed locally, you can still run the rest of the Python test suite.
Run linting and extra checks
From the repository root, it is also useful to run:
cargo clippy --all-targets
Before opening a pull request, please run the tests that are relevant to the part of the codebase you changed and make sure they pass locally.
Run the benchmarks
The Python benchmark suite uses pytest-benchmark. If you have not already
installed the development dependencies, do:
cd python
source venv/bin/activate
pip install -e '.[dev]'
pip install "pytest-benchmark[histogram]"
Before running the benchmarks, generate the benchmarkable optimizers:
python test/prepare_benchmarks.py
Then run the benchmark suite with pytest:
pytest test/benchmark_open.py --benchmark-only
To generate a histogram report, pass an output prefix:
pytest test/benchmark_open.py --benchmark-histogram=out
This will produce a file such as out.svg in the current directory.
Produce coverage reports
For Python coverage reports, activate your virtual environment in python/
and install coverage if needed:
cd python
source venv/bin/activate
pip install coverage
Then run the main Python test suite under coverage and print a summary:
coverage erase
coverage run --source=opengen test/test_constraints.py
coverage run -a --source=opengen test/test.py
coverage run -a --source=opengen test/test_ocp.py
coverage report -m
To generate an HTML report, run:
coverage html
This writes the report to python/htmlcov/index.html.
For Rust coverage reports, install cargo-llvm-cov once:
cargo install cargo-llvm-cov
Then, from rust/, run:
cargo llvm-cov --html
The HTML report will be written to rust/target/llvm-cov/html/index.html.