It would be helpful to have a way to solve for dependencies with constraints on the dates that dependencies were released. For example, I would like to be able to solve for an environment based on what was available two years ago.
Possible interface
My first thought was to have a --target-date flag. Doing uv pip compile astropy --target-date="2015-04-01" would solve the environment based on what the pythoniverse was like on 2015 April 1, and exclude versions of packages that were released after that date. But, I am totally open to other possibilities!
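The core of such a flag would be filtering candidate versions by their upload timestamps (which PyPI's JSON API exposes per release). Here is a minimal Python sketch of just that filtering step; the function name and the example release data are hypothetical, and this is of course not uv's actual resolver logic:

```python
from datetime import datetime, timezone

def filter_by_target_date(releases, target_date):
    """Keep only versions uploaded on or before the target date.

    `releases` maps version strings to upload timestamps, loosely
    mirroring the upload times in PyPI's JSON API metadata.
    """
    cutoff = datetime.fromisoformat(target_date).replace(tzinfo=timezone.utc)
    return {
        version: uploaded
        for version, uploaded in releases.items()
        if uploaded <= cutoff
    }

# Hypothetical upload times for a package's releases.
releases = {
    "1.0": datetime(2014, 6, 1, tzinfo=timezone.utc),
    "1.1": datetime(2015, 3, 15, tzinfo=timezone.utc),
    "2.0": datetime(2016, 1, 10, tzinfo=timezone.utc),
}

print(sorted(filter_by_target_date(releases, "2015-04-01")))  # ['1.0', '1.1']
```

In a real resolver this filter would apply to every package in the dependency graph, so the whole solve happens against the snapshot of PyPI as of the target date.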
Use cases
- Continuous integration testing — Packages typically run test suites in multiple environments (e.g., different Python versions, lowest allowed dependencies, different versions of NumPy, etc.). Having a way to solve for an environment on a particular date would be a straightforward way of generating different test environments. Beyond testing only the oldest and newest environments possible, it'd be straightforward to also test some intermediate dates.
- SPEC 0 proposes a policy for the scientific pythoniverse that "support for core package dependencies be dropped 2 years after their initial release." The capability proposed here would make it a lot easier to figure out when to drop dependencies.
- Reproducing years-old scientific research. If we want to reproduce research from an old Jupyter notebook, it is helpful to have a way to recreate a Python environment from the time the notebook was written. The Jupyter notebook could have been a supplement to a research publication, or one of our own that we haven't touched in a few years.
- Creating an environment to run a package that is no longer being maintained. If a package was last updated in 2014, it's likely that it will not work with the newest versions of its dependencies. Being able to solve for an environment based on a particular target date could make it easier to get this old code running again.
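For the SPEC 0 use case above, the date arithmetic itself is trivial; the hard part today is looking up initial release dates by hand. A hypothetical sketch of the policy's math (simplified — real support schedules may round to a release quarter):

```python
from datetime import date

def spec0_drop_date(initial_release: date) -> date:
    """Date after which the quoted SPEC 0 policy says support for a
    core package dependency may be dropped: two years after its
    initial release. (Hypothetical helper for illustration only.)"""
    return initial_release.replace(year=initial_release.year + 2)

# A dependency first released 2015-04-01 could be dropped after:
print(spec0_drop_date(date(2015, 4, 1)))  # 2017-04-01
```

With a target-date flag, the same question could instead be answered empirically: solve the environment as of two years ago and see which versions fall out.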
Alternatives
- With regard to SPEC 0 compliance, I've usually looked up packages on PyPI and updated our pyproject.toml manually. This suffices for small use cases but doesn't scale well.
- Creating a standalone package would be a possibility, but it would lead to a more fragmented packaging landscape, and I don't think it would be likely to happen.
- The --resolution=lowest and --resolution=lowest-direct flags address some of the continuous integration testing use case above. (These flags are awesome, and are the reason why I switched to using uv!)
- I also thought about having a flag for a minimum date. It could be useful in conjunction with --resolution=lowest to handle situations where a package doesn't list a minimum version of a dependency and we want to avoid ending up with v0.0.1. However, --resolution=lowest-direct would probably suffice for those situations.
Thank you so much for creating an awesome tool!