Warn if newly introduced upper version bounds lead poetry to install older version of a package #6884
I'm not sure there's much that we can do that is productive here. Poetry has encouraged the use of upper bounds and semver, which many have argued at length may be more harmful than helpful to the ecosystem.

In any case, I think the issue here is discovering and selecting new versions using poetry update. This has never been the workflow that anyone I work with who uses Poetry regularly prefers -- for applications, typically, we use some automated tooling to notify us of new versions, or look at the release/project page for what we depend on. Then, if we decide we want to ensure we have a new minimum version to depend on, we manually update the constraint and ensure that version is used. Poetry's solver tells us if this is not possible; if upstream set requirements too strictly, this typically results in an open bug and/or a temporary fork if needed.

For libraries, the opposite is true -- mostly we (being myself and those whose workflows I am familiar with) keep the constraints as wide as possible, only bumping them when we know for sure there is an incompatibility. Here, Poetry's defaults are very unhelpful, but we've generally expected library authors to be sophisticated enough to realize they should design their constraints appropriately for their end users.

I don't think there's anything productive to be done here with poetry update. @henryiii has argued extensively for a change to both defaults and semantics, and personally I'm interested in exploring it, as Poetry's opinionated defaults were intended for the original use case of locking applications and were not considered in light of Poetry's increasing use as a tool for building libraries, or software that is more than a consumer of Python packages.

In any case, I'd be interested to hear if you think I missed something, but I mostly think that this is more evidence that we should consider adding nudges for users (e.g. different or context-sensitive defaults, the ability to decouple Requires-Python from checks in the build system, more of a focus on features needed by those who build libraries). I don't think teaching poetry update to nag the user that there is a later version than the one they have locked makes sense.
#1394 is somewhat relevant, asking that when Poetry downgrades a dependency it should say "downgrading" rather than "upgrading". That feels like it should be straightforward; a pull request is surely welcome. That doesn't help you to realise that you have had an outdated dependency ever since it was first added, but special-casing the situation where some package is excluded exactly because it declares python-requires and its predecessor doesn't seems like too much.
I can understand that this feature might be too niche for the effort. I think the best workaround for direct dependencies is adding them through the poetry add command, which seems to pin the current version and will give an error message if there is an incompatibility. That does at least fix issues like the old pyinstaller versions above.
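For example (a minimal sketch; the version number is illustrative, not taken from an actual run), poetry add records a caret constraint on whatever version is current, so the solver can no longer quietly back-solve to an ancient release:

```toml
[tool.poetry.dependencies]
python = "^3.9"
# Written by `poetry add pyinstaller` (version shown is illustrative); with
# this floor in place a silent fallback to the 4.x series is impossible:
pyinstaller = "^5.6"
```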
That is the intention. Also, any dependency you care about version-wise is a direct dependency, for what it is worth. Even if you don't directly import it, if you care about a package's resolution at all, it should be a direct dependency.
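For example (the package names and versions here are illustrative assumptions, not from this thread), a transitive dependency whose resolution you care about can simply be listed directly with its own floor:

```toml
[tool.poetry.dependencies]
python = "^3.9"
pandas = "^2.0"
# numpy is already pulled in transitively by pandas, but declaring it
# directly lets you enforce a floor and makes the solver error out, rather
# than silently downgrade, if that floor cannot be honoured:
numpy = ">=1.24"
```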
Requires-python is special; the field was designed only to allow support to be dropped for a version of Python - that's what the current solvers assume. It was not designed to allow a user to block future versions of Python, and I believe it would never have been added if that was the proposed usage.

Currently pip and Poetry "back-solve": if the latest version doesn't match, they look through past versions of a library, trying to find an older one that does. This is (I would argue) never correct in the "capping" case - an older release is never more likely to support a newer version of Python it has never seen than the release that added the cap. The solver should either a) ignore upper caps on python-requires entirely, or b) immediately error out. (Technically, I think back-solving for any version cap is also usually incorrect, but handling that complicates solvers which are already complicated.)

https://discuss.python.org/t/requires-python-upper-limits never really got a conclusion, but that's partially because I need to implement a feature in packaging that makes this easier to work with / detect.

Even though Poetry lists Python in the dependencies, it's special there too - it's the only dependency Poetry can't solve for (it's fixed), and when making a lock file, it's the only one Poetry solves for as a range, rather than picking a version and adding a hash. The lock file produced has a valid range (which is why adding a Python cap forces all Poetry projects that depend on that library to also include the same cap, even if they disagree with capping). The python-requires metadata is forced to be the same value as the Poetry locking solver range; there's only one setting. I've been rather hoping that adding PEP 621 metadata will provide two ways to set this, so that if both are set, the tool.poetry setting would be the locking solver setting and the PEP 621 one would be the metadata setting.
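To make that last idea concrete, a sketch of the hoped-for split (this is not how Poetry behaves today; currently both values come from the single python constraint):

```toml
[project]
name = "example"
version = "0.1.0"
# Published metadata: no upper cap, so consumers on future Pythons
# are not blocked or back-solved around.
requires-python = ">=3.8"

[tool.poetry.dependencies]
# Hypothetical separate setting used only by the locking solver:
# lock for the interpreters we actually test against.
python = ">=3.8,<3.13"
```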
I agree that there are likely plenty of cases where an older version is unrestricted with respect to requires-python only because the maintainer hadn't yet added that information, and not because they actually intended to declare support for all Pythons. I also agree that the use of upper bounds is at least contentious. However, we are where we are, and I am more than sceptical about having solvers guess what was intended rather than using the provided metadata and playing by the rules. (Doubly so in a case like this, where making that guess would break previously working installations.) Don't hate the player, hate the game: pursuing this thread at the ecosystem level seems the right thing to me. I agree with much of your second paragraph, but it feels somewhat tangential to the issue as raised and is probably sufficiently covered elsewhere, e.g. #5823.
I pretty much agree with everyone here; I think #5823 deserves particular consideration, along with rethinking the aggressive encouragement of semver (and again, here it was thought that defaults are mostly harmless and will be disregarded by users for whom they do not work; however, it has become clear over time that most users will not take extra steps to research and understand the specifics, and we need to provide a better way for users to discover what version ranges make sense for their use case). I also think that this is something that needs to be solved in the ecosystem and that Poetry can take steps to correct it if we think we've taken missteps and contributed to the problem; I don't see how adding a warning that a more restrictive version range could allow a newer package version is particularly useful. I still think that
As the author of #5823 I'd like to note that I also considered writing a feature request similar to this one (#6884). The reason I didn't was that I couldn't find a way it might work. The closest I got was for the solver to run twice: once normally and once with the project's python constraint replaced. Obviously a feature request to suggest running the solver twice was never going to fly. Perhaps someone might consider a new command (perhaps a plugin) to perform this analysis. But integrating it into
This is very valuable learning for us all! I get some pushback when I tell people we shouldn't just use the defaults or copy-paste from documentation. People try to suggest that I'm wrong because "I disagree with the recommended approach". This happens even when the authors didn't think they were recommending an approach, just offering an example. To that end, poetry's documented use of semver

FYI I wrote #5823 as a knock-on from spending half a day diagnosing why our build suddenly ran very slowly after upgrading the Ubuntu docker version... I fell down a rabbit hole and discovered we were using a heavily antiquated version of scipy (that version wasn't built for the new Ubuntu's libc, so suddenly the build spent 15 minutes per build GCC-compiling scipy). It is deeply frustrating that poetry directs you to set a python version constraint without making the side effects obvious. Unfortunately there's no good way around it.
I'm a bit confused here about what exactly the recommended behavior (even at the ecosystem level, not only the Poetry level) is. I came to this bug when someone asked me why we don't support Python 3.12 in our package (not a library). Here's what happened to us: I had an upper bound on our python constraint. I wouldn't originally have added that upper bound if it wasn't necessary; however, we had at one point in history used a specific Pytorch version from a whl source (#6409). Those whls are Python interpreter version specific, so I believe there was no other option but to pin to 3.10 (or some other version). It turns out we do not need pytorch at all anymore, so I removed that dependency and started looking for the highest upper bound we could use.

Now I get an error involving aiofile. I don't quite understand if aiofile is here doing the right thing or the wrong thing by declaring an upper bound of 4, but let's accommodate as the FAQ suggests and add a matching upper bound to our own constraint. Now I get older versions of several packages (sorry, I don't 100% understand the interplay between 'upgrade' and 'install --sync', so I looked at the output of both).

Ok, so we got some ancient versions of numpy, scipy and pandas-stubs, because these older versions did not declare upper bounds while newer ones do. Setting python = ">=3.10,<3.13", at this time, causes the behavior I want. I assume (perhaps incorrectly) that declaring upper bounds in some of these libraries is the Right Thing, which is why they started to do it once they realized so. What I don't quite understand is whether there's some suggestion of what to do about the older versions that (incorrectly) don't. Presumably the missing upper bound was a bug, and that bug was fixed; now it seems very counterproductive that the bug fix causes the buggy versions to be installed when I relax the python version constraint. From the ecosystem perspective, is there a good (or even a reasonable) answer to this? Rerelease ancient packages with stricter version dependencies? Or is a bad choice of constraints in one release something that you just have to live with forever? (If there are discussions of this issue elsewhere, I'd love some pointers!)
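Spelled out as a pyproject.toml constraint (the first, commented-out value only reflects aiofile's declared cap and is shown for contrast; the second is what I settled on):

```toml
[tool.poetry.dependencies]
# Accommodating only the widest declared cap (<4) still lets the solver pick
# ancient numpy/scipy/pandas-stubs releases that never declared any cap:
# python = ">=3.10,<4"
# The range that, at the time of writing, gives the resolution I want:
python = ">=3.10,<3.13"
```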
The discussion is here: https://discuss.python.org/t/requires-python-upper-limits/12663

NumPy should not be adding an upper cap; this breaks locking solvers that don't have separate settings for package metadata and the locking solver. As covered in the discussion above, the Python ecosystem doesn't support upper caps correctly. Ideally, you should be able to tell Poetry's locking solver you only want to solve for <3.13 but still keep your public metadata uncapped. It doesn't let you do that; Poetry forces both to be the same thing.

I spent a lot of time convincing the Numba developers to remove the upper cap and replace it with a runtime error (with a nice message). They have been very happy with the results. I'm not sure I have the energy to convince the NumPy, SciPy, and Pandas developers of the same thing, especially since they started doing this recently. I bet it was added when meson-python was adopted, since meson-python doesn't have a custom Python build step that could emit a custom error message when a Python version is unsupported (but Meson should let you do this, I think?). Maybe this is a job for a Scientific-Python SPEC?
This issue has been automatically locked since there has not been any recent activity after it was closed. Please open a new issue for related bugs.
Feature Request
Many projects have upper version bounds in the form of python = "^3.8,<3.11", for example. If a user wants to install them and has an unbounded version like python = "^3.9", they usually get the helpful error message that this is not possible and that they should limit their version as well. However, some libraries used to have unbounded version requirements and then added the upper version bound later. This leads the version resolver to install outdated versions.

One reproducible example is creating a new project and having these dependencies specified:
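A minimal sketch of such a project's dependencies (the exact constraints below are illustrative assumptions, not the original snippet):

```toml
[tool.poetry.dependencies]
# A python range with no upper cap beyond the major version...
python = "^3.9"
# ...plus a package whose newer releases declare a requires-python upper
# bound that its older releases lack:
pyinstaller = "*"
```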
This will install the old 4.5.1 version of pyinstaller, whereas the newest one is 5.6 or so, and the user does not know that anything went wrong, because there is no error message. Another example: I tried loosening my version requirements from <3.11 to <3.12 and pandas-stubs was downgraded by three minor versions, presumably for the same reason.
Maybe someone can think of scenarios where this behaviour is desirable (I am not an expert), but I think in the vast majority of cases there should be at least a warning. (Or maybe this shouldn't be done at all? Does it ever happen that older package versions have better support for new Python versions than newer package versions?)