
Conversation

@jokasimr (Contributor)

Fixes #463

pyproject.toml (outdated)

```diff
-scipp = ["scipp"]
-all = ["scipp", "ipympl", "pythreejs", "mpltoolbox", "ipywidgets", "graphviz"]
+scipp = ["scipp>=25.5.0"]
+all = ["scipp>=25.5.0", "ipympl>0.9.4", "pythreejs>=2.4.1", "mpltoolbox>=24.6.0", "ipywidgets>=8.1.7", "graphviz>=0.20.3"]
```
@nvaytet (Member), Jul 15, 2025

I'd like to push back a little on this approach...
While I think it's a good idea to have lower bounds for the packages that we maintain ourselves, adding lower bounds everywhere feels like it could bring unforeseen problems.

Or at least, we should go back and find the lowest version of every dependency that is compatible with our package, instead of just adding the version number that was solved by the last run of `tox -e deps`.

Say there is another package, not in our dependencies but required by the conda env on a VISA instance (e.g. an analysis package), that has added an upper bound (for some reason) on e.g. `ipympl<0.9.3` (or maybe `numpy<2` would be a common one, because packages may not yet have migrated).
So now we have an unsolvable environment, even though `ipympl==0.9.1` would actually have been completely fine for us.
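To make the concern concrete: the two constraint sets below have an empty intersection, so no solver can satisfy them together (a hypothetical sketch; the names and pins are illustrative, taken from this discussion):

```toml
# Our side (this PR): a lower bound in plopp's pyproject.toml.
[project.optional-dependencies]
all = ["ipympl>0.9.4"]

# The other side: an unrelated analysis package in the same conda env
# declares an upper bound, e.g.
#   requires: ipympl<0.9.3
# No version satisfies both, so the environment is unsolvable, even
# though ipympl==0.9.1 may have worked fine for plopp in practice.
```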

Generally, the fewer bounds we have the better?
Unless I am missing something? (I have to admit I did not follow all the discussion in the copier template)

I see it as a fine balance. If you can't, for example, import plopp because it fails due to a dependency, then we should absolutely have a lower bound. Same if you just can't plot anything.
If there are just a couple of glitches, or not all the features are fully working, then we should avoid the lower bound?

Of course, finding the correct lower bounds for all dependencies is difficult/a lot of work, as was mentioned by JL in the original issue...

@jokasimr (Contributor, Author), Jul 16, 2025

Please read this comment, as it addresses some of your concerns and might clarify the reasoning behind the change: scipp/copier_template#261 (comment)

In practice there are lower bounds on our dependencies, whether they are explicitly specified or not. Ideally we should find those true lower bounds and set them as the lower pins. But in practice that is cumbersome and doesn't yield much benefit because it's unlikely that anyone will try to use our packages with very old versions of dependencies.

Instead, the strategy used here was to set all the dependencies to a version that is roughly one year old.
If you think that is insufficient, we can of course go back further: two years, or maybe even three.

The lower pins in this PR don't exclude numpy versions below 2.0, so that is not a concern; the lowest supported numpy version with these pins is 1.26.4.

> instead of just adding the version number that was solved by the last run of `tox -e deps`.

Indeed, that sounds like a lazy and bad approach to finding the bounds. But that is not the criterion that was used to determine the lower bounds.

Member

> I see it as a fine balance. If you can't, for example, import plopp because it fails due to a dependency, then we should absolutely have a lower bound. Same if you just can't plot anything.
> If there are just a couple of glitches, or not all the features are fully working, then we should avoid the lower bound?

I disagree about the last point here. If a package can be installed, it should also fully work and not crash at runtime because of some dependency conflict. (Modulo optional features that need to be handled with a meaningful exception.) If this was not the case, users could not trust that our packages work. And we would also not get any indication that there is a problem until users come to us with cryptic error messages.

If we want to be backwards compatible, we should handle both new and old interfaces of a library explicitly. In general, we should be careful with using new features early.

And if a user fails to resolve an env but they still want to use the conflicting packages together, they can override transitive dependencies. (At least with uv. Not sure whether pixi supports it yet. Pip and conda don't, to my knowledge.)
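For instance, a minimal sketch of such an override using uv's `override-dependencies` setting in pyproject.toml (the package and pin are illustrative):

```toml
# Sketch: force the resolver to use this version of a transitive
# dependency, ignoring the conflicting package's declared bound.
[tool.uv]
override-dependencies = ["ipympl==0.9.1"]
```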

Member

If the following problem

> otherwise some of them resolve to something ridiculous like 0.0.1-alpha-use-at-your-own-peril.

is something that happens in practice, then I agree with the 1-2-year-old approach 👍

As you say, ideally we should only have a lower bound on a dependency if we know for sure that our package won't work below that, but that is too much work to figure out for all deps.

> And if a user fails to resolve an env but they still want to use the conflicting packages together, they can override transitive dependencies. (At least with uv. Not sure whether pixi supports it yet. Pip and conda don't, to my knowledge.)

I was just concerned this could become an issue on e.g. VISA (which uses conda).
For instance, I remember several packages pinning `ipywidgets<8` for a while, because things had changed in version 8.

But maybe we can see how this goes, and if we can't solve environments on VISA, we can fix the pins on a case-by-case basis?
Alternatively, we could consider the VISA environments as part of our lower-bound testing, e.g. try to solve a VISA conda env as a second test? I don't know how annoying that is (for example, I am not sure the list of packages for the VISA envs is publicly accessible online).

@jokasimr (Contributor, Author), Jul 16, 2025

> If the following problem
>
> > otherwise some of them resolve to something ridiculous like 0.0.1-alpha-use-at-your-own-peril.
>
> is something that happens in practice, then I agree with the 1-2-year-old approach

If you ask uv to resolve to the earliest versions of dependencies without setting any constraints on them, that is exactly what happens.
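For reference, a sketch of how that resolution mode can be requested in pyproject.toml (uv also accepts it as a `--resolution` flag on the command line):

```toml
# Sketch: tell uv to pick the lowest version that satisfies each
# requirement; dependencies with no lower bound then resolve to
# their very first release.
[tool.uv]
resolution = "lowest"
```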

If some package in VISA pins, for example, `ipywidgets<8`, and the tests in plopp don't pass with `ipywidgets<8`, isn't that something we want to be aware of? That's the purpose of the "lower bounds" nightly run. It doesn't mean we necessarily have to update the pins to make the nightly pass; that can be decided on a case-by-case basis. If the dependency is central to the package, then it probably makes sense to update the pin, but sometimes it could also make sense to update our package so it continues to support the older version of the dependency.

Member

The ESS-* envs on VISA use the ess* packages as the main entry point to resolve the envs. So if we maintain clean lower bounds on our releases of the ess* packages, we should be fine, unless some analysis package has an upper pin lower than our lower pin in ESS-*.

I agree it's tedious to find the lowest possible deps for all packages, but ideally we pay this cost only once.

@jokasimr (Contributor, Author)

I fiddled a bit with the pins and managed to relax some of them.
The matplotlib pin had to be increased to make the tests pass; I didn't notice that before because I didn't remove the uv.lock file between test runs.

@jokasimr merged commit 4b40990 into main on Jul 17, 2025.
4 checks passed
@jokasimr deleted the lower-pins branch on July 17, 2025 at 07:25.

Merging this pull request may close: Add minimum Scipp requirement?