In #2514 I updated the lockfile to get the latest package versions, and introduced some minor compatibility fixes. However, something in the newer versions of those packages causes problems for Python >= 3.10. Specifically, poetry's resolver ends up giving us:

- `numpy == 2.1.3`
- `pandas == 2.0.3`

Together these lead to:

`ValueError: numpy.dtype size changed, may indicate binary incompatibility. Expected 96 from C header, got 88 from PyObject`
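For reference, one way to confirm which versions poetry resolved and to trigger the failure in the dev environment. This is just a hedged sketch; it assumes a poetry-managed virtualenv and uses standard `poetry`/`pip` commands.

```shell
# Show the versions poetry actually installed into the virtualenv
poetry run pip show numpy pandas | grep -E "^(Name|Version)"
# Importing pandas against the mismatched numpy is what raises the ValueError above
poetry run python -c "import pandas"
```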
From what I gather, this is due to a change introduced in `numpy == 2.0.0`. The first `pandas` release compatible with this major version of `numpy` is `pandas == 2.2.2`. However, poetry's resolver is not aware of this fact, as `pandas` only declares lower version constraints on `numpy` (for some versions of `pandas`). For instance, `pandas == 2.0.3` does not cap the `numpy` version, so as far as the resolver is concerned the above pair of packages is perfectly happy together. There is no way (afaict) to specify dependent constraints between two packages to restrict this sort of thing.

It is also worth noting that with a standard `pip install` this doesn't seem to be an issue - possibly it only occurs because of the particular combination of packages we have in dev. At any rate, I am reluctant to include any additional hard constraints on our requirements, as this would be unnecessarily limiting for users, given there are no actual direct breaks for us. Anyone downstream who encounters this issue in their own environment can simply include appropriate constraints themselves (an example is sketched at the end of this description).

Therefore I think the best approach is to introduce a hard constraint in the dev dependencies only. That means for devs and in CI, `pandas` has a minimum version of `2.2.2` whenever Python is `>= 3.10`. This should once again give us a valid set of dependencies, without impacting users.
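For concreteness, a minimal sketch of what that dev-only constraint could look like in `pyproject.toml`, using poetry's multiple-constraints syntax with `python` markers. The group name and the `"*"` fallback are assumptions here and would need to match however our dev dependencies are actually declared.

```toml
# Dev-only constraint: force a numpy-2-compatible pandas on Python >= 3.10,
# while leaving the main [tool.poetry.dependencies] section (and therefore users) untouched.
[tool.poetry.group.dev.dependencies]
pandas = [
    { version = ">=2.2.2", python = ">=3.10" },
    { version = "*", python = "<3.10" },  # keep whatever constraint we already use on older pythons
]
```

Anyone downstream who does hit this in their own environment can get the same effect with a single marker-qualified requirement, e.g. `pandas>=2.2.2; python_version >= "3.10"`.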