the problem with replacing dependency links with PEP 508 URL requirements #5898
Comments
Adding the deprecation label because it relates to a pending deprecation / whether something should be deprecated.
I think so, given that they're supposed to be the intended replacement for dependency links. Perhaps a discussion over at distutils-sig is needed for that.
I agree, this should be raised on distutils-sig. However, I'd prefer it if the discussion were framed as "how to make URL links a complete replacement for dependency links" rather than just suggesting this single change. Ideally, this is the only change needed (as someone who doesn't use dependency links, I can't really comment on that) but I think the key here is to get community agreement that URL links are an acceptable replacement, so that we can finally retire dependency links without needing to worry about the possibility that someone pops up with a use case we hadn't considered. (Of course, that may still happen, but it's easier to point to a distutils-sig discussion that was everyone's chance to speak up, than to simply say "we didn't think of that" :-))
I've been using @ URL links (with @benoit-pierre's patch from #5780 (comment)) with a lot of success. We simply specify the exact build that is needed in the URL, which seems to work for us. I can see however that the ability to do version matching on that string would be helpful.
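For anyone unfamiliar with the syntax being discussed, here is a minimal sketch of a PEP 508 direct-URL requirement as it would appear in `install_requires`. The package name and URL are hypothetical, not from this thread:

```python
# Hypothetical example of the PEP 508 "name @ url" requirement syntax.
# This is the list you would pass as setup(install_requires=...).
install_requires = [
    # Pins one exact artifact; the grammar gives no way to widen this
    # to a version range such as "mylib >= 2.3".
    "mylib @ https://example.com/wheels/mylib-2.3.1-py3-none-any.whl",
]

# The project name is everything before the " @ " separator:
name = install_requires[0].split(" @ ")[0]
```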
FWIW, from trying to use PEP 508 URLs, the lack of version specifiers was crippling and I'd also be in favor of adding them.
What's the status of this issue? I was really surprised that dependency links were removed without addressing this issue first. Version specifiers are really important. Any news?
So, I started this thread on distutils-sig. I raised this issue and discussed it with the developers. In the end, I was convinced that changing PEP 508 to include version specifiers is both impractical and unnecessary. I'd like to explain to other pip users why I reached this conclusion:
I have a setup like this that is now broken:
When I install package I know there's PEP 508. How do I tell pip to get the
Excuse my rant; apparently
[Edit: Just read the whole conversation] @stinovlas Just trying to understand and reach your same conclusion. What did you mean by "private packages that depend on each other in a non-trivial way"? Are you just referring to latest/versioned URL support?
It does work. But you can only depend on one specific commit-ish (e.g. a branch, tag, or commit hash). You can't say "I want version >= 3.2 from this repository."
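To illustrate the limitation (repository URL hypothetical): a requirements line can pin exactly one ref, but cannot express a range:

```
# You can pin a single commit-ish (branch, tag, or commit hash):
mylib @ git+https://example.com/org/mylib.git@v3.2.0

# But there is no spelling for "any 3.2-or-newer release from this
# repository" — something like "mylib >= 3.2 @ git+..." is not valid
# under the PEP 508 grammar, which allows a version specifier or a
# URL, never both.
```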
since `--process-dependency-links` has been deprecated and removed in pip >= 19.0 For reference: pypa/pip#5898
Here's a workaround for anyone interested (yes, it's ugly):
So it's been mentioned that you should just run your own index server if you need a dependency URL. That's fine and dandy, except I literally can't get pip to look in my index server if I specify a dependency from it in setup.py. Can somebody explain how you're supposed to specify an index server in setup.py without dependency_links, given that pip doesn't respect my environment while building a wheel?
@stefansjs You do not specify an index server in setup.py. Instead, the user chooses what index to install from when they install your package.
Forgive my ignorance here. How do I then build my wheel to publish to my private index server if it has dependencies in my private index server? It seems like not respecting index-url at build time and not respecting dependency_links at build time pretty much eliminates hosting my own index server. What am I missing about building my own packages to host on my own index server?
I'd like to cover one of the use cases. Suppose the project is developed in-house and consists of several Python packages. During the development, the main package is installed via
Once the development is over, the project is archived (via
Currently, the last command tries to access URLs in VCS specifiers, because
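My reading of this use case, sketched as commands. This is an assumption about the intended flow, not the poster's actual setup; package and repository names are invented:

```
# During development: install the main package straight from VCS
pip install "mainpkg @ git+https://vcs.example.com/team/mainpkg.git@main"

# At archival time: vendor every dependency locally...
pip download -r requirements.txt -d ./vendor

# ...so the archive can later be installed with no network access.
# This step is where VCS URLs in requirements.txt cause trouble,
# because pip still tries to reach them despite --no-index.
pip install --no-index --find-links ./vendor -r requirements.txt
```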
Everyone, before commenting further here, please read this thread on distutils-sig. @pypa/pip-committers Given that this has been discussed (and that discussion summarized here: #5898 (comment)), and the consensus was to not change anything beyond the status quo, is there anything actionable here?
I have another use case that PEP 508 does not cover well: finding wheels.

pytorch's PyPI version is very large: it includes all the fancy GPU code needed for training neural networks, and that, compiled, is about 600MB. That's way too much to ask our users to install, and it's too much even to ask CI to install. And we definitely don't want to be having users compiling torch from scratch. To address the elephant in the room, pytorch has a solution: they provide

To get them, then, we put this in a

I want to migrate towards having everything in setup.py, because I want to eventually be able to package and release on PyPI ourselves. But

makes no sense. It pins my package to linux. I wish I could tell setup()

Is the answer going to be that packages on PyPI have to only depend on other packages on PyPI? I guess that's a reasonable answer, but pragmatically I've gotta deal with non-PyPI sources and I wish I could keep all my dependency metadata in one place.

wild workaround

Maybe I could use more explicit environment markers (https://www.python.org/dev/peps/pep-0508/#environment-markers), but that gets super verbose super quickly, and pip already has this logic built in:
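For concreteness, here is roughly what the verbose environment-marker approach looks like. The wheel URL, versions, and file name are invented for illustration, not pytorch's real artifacts:

```python
# Hypothetical per-platform pinning with PEP 508 environment markers.
# Every extra platform / Python-version combination needs its own
# line, which is why this approach gets verbose so quickly.
install_requires = [
    # Direct-URL pin, restricted to one platform/interpreter combo:
    'torch @ https://example.com/whl/cpu/torch-1.2.0-cp37-linux_x86_64.whl '
    '; sys_platform == "linux" and python_version == "3.7"',
    # Everywhere else, fall back to the (large) PyPI build:
    'torch ; sys_platform != "linux" or python_version != "3.7"',
]
```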
But keep requirements.txt, because we need to use it for pytorch (pytorch/pytorch#26340) and --find-links (-f) isn't supported by pip's new URL pinning format: pypa/pip#5898 (comment)
No, but the end user needs to explicitly opt into any other index being used. This was a deliberate policy decision, to prevent malicious code on PyPI triggering download of code from other, arbitrary, locations. So you depend on whatever version of pytorch you want, and instruct your user to add
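In practice the opt-in described above looks something like this in the user's own requirements file (the index URL is hypothetical):

```
# The user, not the package, decides which extra index is consulted:
--extra-index-url https://download.example.com/whl/cpu
torch>=1.2
```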
Thank you for explaining the history there. I can understand the reasoning. But I don't think it was that effective, because nothing stops malicious or just vulnerable code from ending up on PyPI:
Using the PEP 508 URL format I can make packages, even ones on PyPI, depend on arbitrary outside locations. For example:
For pure-python source packages this works every time. It would be helpful if that worked for wheels too. And there's another way to circumvent user opt-in: you can hide the
This is all pretty inconsistent and confusing :/. In practice it just means that, in order for devs to minimize the headache we give our users, we'll write scripts like this. Our users don't know what pip is or who runs PyPI. I don't even know who runs PyPI, but I assume they've got it in hand. Our users definitely haven't thought through the implications of contacting this domain vs. that domain.

I'm sorry for complaining. I know PyPA is a big project and this is one more straw on the back. I think you're doing good work shaping all this clay! And that it's a lot to consider! Here I just want to make sure this one use case isn't forgotten: you support getting source packages from arbitrary URLs, so please also support binary packages the same way.
You are free to disagree with the policy, but the issue is outside the pip developers' hands and needs to be handled in pypa/warehouse instead.
It's about how pip chooses servers to download from. Here's something that would solve my use case: reintroduce find-links as a PEP 508 URL scheme, maybe
If malicious code is already on PyPI, why would it need to pull in code from other locations? Why not just push more malicious code to the same package? Or to another package and push that to PyPI? I think the desire to make pip more secure is great, but I don't think the mitigation that was taken is effective (see above
I recant. I made a project with
PyPI rejects it when uploaded in wheel form, but not in sdist form; and pip rejects the sdist form when it downloads it. Details at https://github.com/kousu/donthackpypi. Okay. So I can accept that if you want to use non-PyPI code you have to tell your users:
ungainly, but it's not worse than the others.
Yeah! Totally. By the way, the workarounds you posted run in
I'd just say that it wasn't entirely about malicious code (although that was part of it), and was largely around user expectations. Users expect that, absent additional configuration, invoking

Dependency links broke this expectation, and brought along with it a rash of issues where "pip install " would depend on an unknowable set of servers outside of just their configured locations. This caused a number of problems, most obviously in locked-down environments where accessing a service required getting a hole punched in a firewall, but it would also render people's attempts to mirror PyPI moot, because these external links wouldn't get mirrored.

Basically, a user should be in charge of where they download files from; it should not be something under someone else's control. Anything that takes that control away from end users is not going to come back, and if the URL form of PEP 508 works on PyPI, that's a bug that needs to be fixed. It should not work for any type of distribution uploaded to PyPI.
My 2 cents: PEP 508 URLs are no replacement for dependency links, because of the lack of version specifiers.
For example, with dependency links you can push a package to PyPI with dependencies on other PyPI projects, but with the option to use a patched version of some of those dependencies (with some extra bug fixes) when using `--process-dependency-links`. I've used that myself on a project depending on PyObjC, because the delay between releases is so long.

Additionally, the lack of version specifiers means there's no way for pip to know whether an existing installation is compatible or not: this is problematic when upgrading a project that depends on another through a PEP 508 direct URL, but it also makes sharing such a dependency between projects problematic.
And finally, dependency links are opt-in, and usable on PyPI. But PEP 508 URLs are forbidden by pip during install for projects originating from PyPI, for "security reasons". This, to me, does not really make sense: it's not like installing only from PyPI is secure!
That last point could be addressed by changing the behaviour in pip (maybe an `--allow-url=regexp` option?), but I don't see a way around the lack of version specifiers. Could the PEP be amended to allow `package >= 10.0 @ url`?