Restore support for newlines in Summary #2893
Comments
Came here to report the same last week. Was this a regression from #1390?
Oh, I guess this is intentional in v59.0.0: https://github.com/pypa/setuptools/blob/main/CHANGES.rst#v5900, per #2870.
As mentioned in #1390, this change was introduced intentionally in order to prevent these invalid inputs. It sounds like this issue only affects older packages installed with the latest Setuptools. These environments are pinning their requirements but not pinning Setuptools. As a result, the maintainers of these environments are asking Setuptools to carry the burden of the legacy introduced by pinning their requirements.

I'd prefer for these projects to advance their dependencies to versions that no longer have the defective usage, but maybe there are other workarounds as well. Why can't these environments work around the issue by pinning Setuptools as well? Or find another workaround, such as installing from wheels (built with an older Setuptools)? For example, I was able to build the affected package by pinning an older Setuptools into the build environment and building a wheel from the sdist.
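Roughly, that approach looks like this (a sketch using django-hijack==2.1.10 from the reproduction below; the exact commands I ran may have differed slightly):

```
# Build a wheel with an older Setuptools; the resulting wheel then installs
# fine under any Setuptools version because no metadata is regenerated.
python -m pip install 'setuptools<59' wheel
python -m pip wheel --no-build-isolation django-hijack==2.1.10 -w dist/
python -m pip install dist/django_hijack-2.1.10-*.whl
```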
I've tried to think of ways Setuptools could support these legacy cases but reject new ones, but I haven't been able to come up with a good approach. I'd really like to avoid leaving Setuptools indefinitely dependent on the legacy of older packages.
Because every project has its own release pace and needs stability; depinning/upgrading those requirements implies extra maintenance and testing time and, in the case of paid work, costs which are not justified here. I agree that quality should be enforced on newly uploaded releases, but installation of older packages should and must still be possible without any change. Seriously, eternally stripping a `\n` is a small burden compared to breaking installs of already-released packages. Another option (a bad one, I think, since it would break the integrity of uploaded dists, break custom hash verification, and not work with alternative mirrors where artefacts won't be patched) is to patch already-uploaded artefacts to make them comply with the new code.
Not to mention that newer releases of dependencies may not be compatible with every stack, and every stack may not be that easily upgradable, just for a `\n`. Seriously, reconsider that nonsensical breaking change...
I've created a plugin, setuptools_hacks.bypass_summary_newline, that provides a workaround for the issue. Users encountering this issue can install the plugin alongside setuptools 59+ to bypass the error by stripping everything from the first newline. Demo:
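(A sketch of the invocation; the exact console output is omitted, and depending on how the build is driven, build isolation may need to be disabled so the plugin is importable by the build backend:)

```
# Install the workaround plugin next to Setuptools 59+, then install the
# affected release; the plugin strips the Summary at the first newline.
python -m pip install setuptools_hacks.bypass_summary_newline
python -m pip install --no-build-isolation django-hijack==2.1.10
```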
This approach allows the select environments that depend on the broken packages to work around the breakage while retaining the check for new packages. It also clearly pushes the legacy maintenance onto the environment maintainers instead of Setuptools itself.
Sorry, but no, it's not AT ALL an acceptable solution; please reopen. I don't want to have to figure out how to push a new plugin into every build pipeline I have. What we need is build stability and reproducibility. Furthermore, no one wants to put "legacy maintenance" on the Setuptools folks; we just want already-working build systems not to break over a `\n`.
I appreciate your enthusiasm and concern. I too have managed private pipelines that would be affected by upstream changes in Setuptools, and it was indeed a hassle when an incompatible change to Setuptools, intentional or unintentional, would break the builds. Some of our projects avoided this issue by pinning Setuptools and only bumping it periodically. And while I would discourage that approach, it does provide more stability in exchange for the toil of occasional updates. If reproducible builds are something you seek, it's probably best not to leave Setuptools unpinned. Each release of Setuptools could potentially tweak the build, changing the discovery of files, the metadata format, or a host of other behaviors.

Although the goal is to maintain compatibility, sometimes backward-incompatible changes are necessary. Setuptools has recently made much more impactful backward-incompatible changes, such as dropping support for Python 2 and 2to3. These changes are necessary to move Setuptools incrementally toward a more usable, maintainable, and correct solution, built on PyPA standards. This change also falls into that category.

Setuptools serves millions of downloads per day, and as best I can tell, the fallout of this change has affected only a couple of environments. In my opinion, it doesn't make sense for Setuptools to adopt the indefinite burden of adapting incorrect inputs for a few cases. If this issue were widespread, I'd definitely like to reconsider it. But when the issue is limited to a few select environments, I would implore the maintainers of those environments to find a way to adapt. Setuptools has provided several workarounds (pinning Setuptools, installing from wheels built with an older Setuptools, and the bypass plugin, among others).
If it can be shown that these workarounds are inadequate, impose a grievous burden, or affect a significant portion of users, we can reconsider this issue.
…uptools — Currently many builds are broken with the newest setuptools. This is because of pypa/setuptools#2870. See e.g. pypa/setuptools#1390, pypa/setuptools#2895, pypa/setuptools#2893.
As you can see, many users are beginning to run into the change. @jaraco, you should reconsider reopening this bug. Pinning setuptools won't help long-term maintenance here: sooner or later the pin will have to be upgraded, and no solution will be available except adding a plugin to the stack, just for a newline that could be escaped... It's just crazy. Already-uploaded dists should be as easily installable as the day they were uploaded, period; you just broke many artefacts that were valid AT UPLOAD TIME. What's wrong with you folks? We can totally understand that things should now be enforced, but already-valid artefacts should still be installable, especially when it comes down to a newline metadata enforcement... So let this fail only for newly uploaded artefacts and, at most, warn for already-uploaded ones.
You can also see from this change that the package owners were unaware of the defect they introduced by copying their long description into the description, including any newlines. Without the breaking change, they might have gone on happily ignoring the warnings indefinitely, continuing to publish packages with broken metadata. It occurred to me that there's a possible fifth workaround as well.
Please limit your comments to the facts of the matter and your opinions, but avoid personal attacks. Those won't be tolerated. Please read the PSF code of conduct and consider yourself warned. To be sure, "us folks" is basically me; other maintainers have not yet weighed in.
I'm unaware of a way to do this. The validation happens at the time the metadata is written. That's the problem with these source artifacts: they're effectively indistinguishable from a package being built from repo source. I suppose it's conceivable that Setuptools could somehow detect the presence of a PKG-INFO from a previous build and, if it detects invalid metadata, bypass the failure. I guess another way would be to enumerate all of the package/version combinations that are affected (essentially an allow-list for already-published artifacts).
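To be concrete about the first idea, a rough, hypothetical sketch (illustrative names only, not actual Setuptools code):

```python
import os
import warnings


def check_summary(summary: str, source_dir: str) -> str:
    """Hypothetical: error on fresh invalid metadata, tolerate legacy sdists."""
    if "\n" not in summary:
        return summary
    if os.path.exists(os.path.join(source_dir, "PKG-INFO")):
        # A PKG-INFO from a previous build suggests an already-published
        # artifact, so fall back to the old warn-and-strip behavior.
        warnings.warn("newline in Summary of a previously built sdist; stripping")
        return " ".join(summary.splitlines())
    raise ValueError("newlines are not allowed in the Summary field")
```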
Thanks for collecting these (and linking them to increase visibility). I've read through these issues. It looks like a lot of them only surfaced as a result of raising an error and probably would have continued to be broken if not for this release. In that sense, the change is working as intended (creating the impetus to fix the broken usage). It would be a real shame to introduce all this disruption only to once again allow the broken behavior to pass. Thus far, the only proposal to address the issue has been to back out the change and retain the warning behavior forever. In my opinion, that's unacceptable. I'd like to see a proposal that ends with Setuptools being able to enforce the standard.
Let me stop you there: nothing personal here. I do not understand the rationale of risking breaking stability over such a low-priority issue. Indeed, as we saw, already-released versions are totally broken for installation right now. I don't want to upgrade a (NON-DIRECT) dependency of stable builds; I want to be able to install the release that was pinned and battle-tested, especially for builds that we know are full of bad habits but that we can't upgrade to the latest release because they are incompatible. I reassure you: we all know, and we all pin everything in stable stacks, I hope, including setuptools itself. But that's only closing our eyes, and the problem will hit us badly sooner or later. That's why most of us also have dev pipelines with some laxity in certain dependencies, to anticipate such changes.
I knew the setuptools code (and especially twine) better 10 years ago than I do now, but the upload command could fail at upload time. And server-side, we could also add an extra check as a belt-and-braces approach. On the other hand, if nothing is done to revert the change, then for already-uploaded artifacts that would be flagged as broken, patching them by stripping out the `\n` could be considered.
Where were these warnings supposed to show up? I didn't see them, and this afternoon when trying to build new container images, the build process failed because of newlines in the description metadata of a pinned package. I have Python package dependencies pinned using "pip-compile", but I don't have "pip" and "setuptools" pinned since they get installed by "ensurepip" as part of the CPython build. I guess I need to go into that build automation and pin the pip and setuptools versions somehow. It seems there is a significant risk of new versions of setuptools breaking my builds. I think pinning versions is a good idea in terms of security (e.g. avoiding malware injected into the supply chain). I think it's not so great when we have to do it for compatibility reasons. I recently had to spend quite a bit of time trying to get other tooling working for similar compatibility reasons.
They show up in the pip log if you run with `-v`.
I do acknowledge that it's a poor experience that warnings from a backend are completely invisible during normal operation. Probably that's something that can be addressed in the packaging ecosystem (pypa/packaging-problems), but not something that Setuptools can do on its own. I do observe that one can run with a warnings filter that promotes UserWarning to an error, which at least makes the message impossible to miss.
But still "UserWarning" doesn't match precisely. Better would be if the filter could match "SetuptoolsDeprecationWarning", but alas, the filter doesn't seem to be able to honor warning subclasses. |
It occurred to me that in this scenario there's another option: the maintainer of an affected package can provide a bugfix release for the specifically affected versions in the wild. In the reported scenario, django-hijack==2.1.10 is affected, but the latest release is not. Presumably the version was pinned for a specific compatibility issue. Regardless, it's the responsibility of the person or project that pinned the version to address the consequences of the pin. In this case, I'd recommend that the user reach out to the django-hijack project and request that a new version be released with the easy fix, something like 2.1.11 or 2.1.10.1.
In pypa/pip#10669, I've recognized a limitation in the build/install process that would make it difficult, if not impossible, to adopt some of the above workarounds, especially in a PEP 517 isolated build environment. To that end, I'm proposing an extension to pip that would give installers more control over builds. If such a feature were implemented, it would readily facilitate working around issues like these.
Seriously, I would be curious to find one maintainer willing to patch every dot-point release for a single `\n`. I understand the Setuptools maintainers' point if we argue in theory about who is responsible for a stray `\n`. But I totally disagree on one responsibility point: you can't put it on each maintainer to patch all of their releases that were legitimate at upload time and so should remain installable (security problems aside), especially when we are talking about a `\n` in a metadata field. Also, we as users don't want to use any "workaround" for such simple cases, as nascheme already said. So all of these are unacceptable in the long term and really increase the maintenance burden: pinning setuptools way down eternally, using a pip plugin, or patching requirements to use a constrained setuptools version as proposed in pypa/pip#10669, especially when it comes to mixing versions for different packages that have different needs. (Don't misread me: details have all their importance, but they still have to be balanced.)
Sorry, but no; this is just a temporary fix to continue operation, and the bug is still present. Formulating workarounds does not address the issue of keeping stability and retro-compatibility. As I said, we use long-term maintenance scenarios with everything pinned, including setuptools, and short-term ones where we don't, precisely to detect further breakages like this one in advance; and bingo, here we are, currently using a temporary workaround, and we therefore consider our pipelines unstable and partially broken as of today, given the solutions you proposed. We don't want either to stay blocked on a previous setuptools version that may become unsupported, or to patch every single pipeline we have to add yet another workaround plugin. Something is really wrong in this flow.
It seems to me that the current cure is much worse than the disease. Couldn't new versions of setuptools refuse to upload packages with junk in the metadata? Also, it could warn loudly when building or installing packages with bad metadata (not requiring the -v flag to see the warnings). If the description is so broken that it can't be used, I would suggest that using a generic description, or no description at all, would be better than outright refusing to install/build the package. Is junk in the description really a serious enough problem to make the package unusable? The package installed fine with previous versions of setuptools, and the resulting application passed all of our QC testing. In my case, the package was never uploaded to PyPI, and I don't think anything ever looked at the description.
Setuptools doesn't do uploads, but even if it did, I explained above why it's not possible to detect the invalid data reliably.
Yes, that's probably an action worth investigating. Unfortunately, it would probably require extensions to the standards between build frontends and build backends (such that warnings in the backend could be surfaced in the frontend). As it is now, pip only surfaces errors from the backend and otherwise hides the output. If someone is interested, that would be an interesting and valuable challenge to tackle, but certainly more than a quick patch.
See #1390 for why this was an issue. Yes, it would be trivial to implement a change that transforms invalid input into valid metadata. That's what Setuptools has done for the past 6 months. Unfortunately, as you can see, that change had little if any effect on the invalid packages (even new releases). I've pointed out above that I don't believe there's any way to distinguish between prior invalid metadata and brand-new invalid metadata. If someone can propose such a mechanism, I'd be interested in exploring a fix.
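For concreteness, the kind of transformation being discussed is roughly the following (an illustrative sketch, not the actual Setuptools code):

```python
def collapse_summary(value: str) -> str:
    # Replace newlines with spaces so the Summary stays a single line
    # in PKG-INFO instead of triggering a hard error.
    return " ".join(value.splitlines()).strip()
```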
I'm glad to hear this change has helped you identify and correct the invalid metadata. At this point, I'm convinced the change as drafted has had the intended effect (even if it was more disruptive than originally intended), and there are no open proposals on how to move forward. I still welcome proposals, but any proposal should include a proof-of-concept implementation, as the nuances seem to be getting lost in the discussion.
I know that you are not a big fan of strong words, but I won't be alone in thinking this is like sabotage. I don't know what bothers me more: knowing that setuptools broke retro-compatibility totally on purpose for no good reason (at least not a legitimate enough one to refuse to install a package that installed fine in the past and has no security concern today), or that nothing is being done to repair something with an obvious flaw. There is some kind of double talk when your users still have to use workarounds to install packages, at equivalent pinning, and have no long-term solutions, and you just tell them that there are solutions and "everything works as expected, no issue". The issues you are describing here look like HTTP smuggling, but the countermeasures taken are like using a sledgehammer to kill a fly.

Since filtering out badly formatted fields is possible (it was already done before), @nascheme had a good idea to restore the non-breaking mode with a long, big, verbose message, with no extra arguments needed; and maybe I would suggest sending a mail to maintainers on upload to PyPI to make the issue really visible to them. But please, restore the old behavior, or arrange it, or fix already-released packages, with the final purpose of letting them install without errors as before, and don't put this burden on users or maintainers (for whom the check should have been done before letting them upload). Arbitrarily changing the rules of the game would really upset most of us, and the majority will always keep silent and maybe never even identify the root of the problem. EDIT: Well, the time.sleep() is maybe not a good idea, as it would continue to break already-released artifacts; removing it.
Thanks for breaking my toolchain :) I doubt some of the packages I use will update their descriptions, so I will freeze setuptools to 58 for the foreseeable future.
Thanks, Jakob, for reporting. Can you share some of the packages that affect your environment? Would you consider using the workaround plugin so you wouldn't have to pin Setuptools? If not, why not?
I strongly support the request to let setuptools strip out the excess newlines rather than making this a hard error and essentially breaking the installation of existing Python package releases that happen to have a newline in their description. I fully understand the motivation for trying to get people to not use newlines in package summaries (cf. #1390), but frankly I think this is the wrong approach... Consider looking at this from the point of view of someone who is not very familiar with installing Python packages.
First of all, the error is very confusing, since it doesn't mention at all where newlines are not allowed. In the code? In the package name? What's going on?! The first reaction of someone hitting this error is most likely going to be confusion rather than the realization that a metadata field is at fault. Even a simple workaround like pinning an older setuptools is not something they will find on their own. I have seen my share of problems with getting software installed, and I must say it took me a while to get my head around what was happening when I first ran into this problem.

Please consider changing this aggressive behavior of breaking a package installation over what most would consider a very silly problem that is easy to bypass (by just stripping out the newlines, replacing them with a space), before the setuptools versions that make this a hard error are picked up by various Linux distributions and start causing more trouble around the world. It's now clear that there are dozens of Python packages out there with a range of versions that are essentially no longer installable with a recent version of setuptools, although there's no good reason for it (since the workaround in setuptools is trivial). I'm happy to contribute to help work out a better approach for dealing with excess newlines in package descriptions, without massively disrupting the user experience.
I just want my old packages to keep building... IMHO it makes no sense to introduce a breaking change on old, already-released packages because of package metadata. Yes, I could pin the setuptools version, and that's exactly what I'll do; but I'm a developer with 10 years of Python experience: I can do that, and I know pinning everything is necessary for reproducible and future-proof builds. But this issue is about the net gain for the community. What's the harm caused by this change, and what's the gain for the Python community? Picture a beginner dev trying to install a not-so-old library that always had those newlines and getting this error. This change is basically blocking the usage of several old Python packages for many developers. This is not nice at all.
I upgraded setuptools just now on my server and ran into this error, which led me to this discussion. The reason I (foolishly) upgraded setuptools is that the old version was producing a lot of internal errors. I don't know what's wrong with newlines in the description, but I strongly recommend handling them gracefully rather than breaking every affected package in the world; setuptools is so foundational that it's not good to casually ignore the user experience just for this tiny, avoidable newline mistake. To be honest, please stop raising errors and use a warning instead...
This bug really shows what a mess the current Python packaging ecosystem is nowadays. There are many reasonable complaints in the thread. What should we do if we have several dated but quite OK packages like https://github.com/jucacrispim/asyncblink? @jaraco, your workaround package with such an awkward name as setuptools_hacks.bypass_summary_newline feels almost like trolling. I'm pinning setuptools to an older version for now.
Perhaps it feels like trolling, but I assure you, it's not. It is meant to have a descriptive but unfriendly name, to make it obvious to the person implementing it that it's meant to be a workaround with a very specific purpose and, as a wart, to reflect that it's not an optimal solution. I know the name is awkward. Here it is in action:
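(A sketch of the session; I'm assuming asyncblink from the previous comment is one of the affected packages, and the console output is omitted:)

```
# With the plugin installed alongside Setuptools 59+, the build succeeds
# despite the newline in the Summary field.
python -m pip install 'setuptools>=59' setuptools_hacks.bypass_summary_newline
python -m pip install --no-build-isolation asyncblink
```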
This question feels like trolling, especially when there's a thread above that provides no fewer than six potential workarounds. Additionally, I've provided guidance above on what I'd like to see to change the situation: a proposal with a demonstration of how you would go about fixing the issue. The only concrete, actionable proposal given thus far is to just revert and allow the broken behavior forever. Other proposals are not actionable because they make assumptions and don't take into account the considerations I've enumerated above. Furthermore, I don't get the impression that anyone has even tried the plugin.

I get it: you're offended because, in principle, setuptools should support building older packages. I agree. But the issue hasn't been important enough for someone to step up and come up with a solution, or even volunteer to own the debt that reverting the change would incur. And no one has stepped up to help develop a solution between pip and build backends to surface warnings. It's a lot of work just to roll out this change: first issuing the warning months ago, then following up, cutting a backward-incompatible release, then responding to the fallout. Now you're asking that we back out that change in order to satisfy a small number of broken packages, undoing that work and re-accumulating the debt. And to be sure, the number and impact are larger than I had anticipated, but I'm asking for help here. Please help us make the ecosystem better by sending the signal to these packages that their metadata is invalid. Please help be a part of the solution. Thanks.
In v59.4.0, I've reverted the change, restoring support for bad metadata. |
Thx @jaraco!
Closes https://bugs.archlinux.org/task/73510 Although setuptools >= 59.4.0 has reverted the backward-incompatible change [1], Arch package is not updated yet. [1] pypa/setuptools#2893 git-svn-id: file:///srv/repos/svn-community/svn@1115671 9fca08f4-af9d-4005-b8df-a31f2cc04f65
setuptools version
>= 59
Python version
3.8
OS
linux
Additional environment information
No response
Description
Builds failing with an error about newlines in the Summary metadata field.
Expected behavior
Warning or nothing; just escape `\n` silently for one-line metadata fields. Lots of already-released eggs won't ever comply with this policy, but we need to install them at the version they are pinned to in our builds, for the sake of reproducibility.
How to Reproduce
pip install django-hijack==2.1.10
Output
Code of Conduct