`--download` fails when using local `--find-links` #1111

Comments
+1 for supporting --download with local --find-links. Here is my actual use case: I want to download all of the requirements of a Python application using a pip command and the requirements.txt for that application. Some of the requirements are available on PyPI, but others are available locally, in a sub-directory named "local". Contents of my requirements.txt:
Contents of my "local" sub-directory:
(acpython-1.3.tar.gz is a source tarball created using "python setup.py sdist".) pip command:
Afterwards, the download directory contains the downloaded python-glanceclient and python-novaclient .tar.gz files, but also the extracted contents of acpython, which is not helpful.
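The symptom described above can be illustrated outside pip. This is a minimal sketch, not pip's actual code: the helper `unpack_file_url_buggy`, the package name, and all paths are hypothetical, mirroring the reported behavior where a local archive gets extracted straight into the download directory:

```python
import os
import tarfile
import tempfile

def unpack_file_url_buggy(archive_path, location, download_dir, only_download):
    # Mirrors the reported behavior: when only downloading, the local
    # archive is extracted into download_dir instead of into location.
    target = download_dir if only_download else location
    with tarfile.open(archive_path) as tf:
        tf.extractall(target)

# Build a tiny sdist-like tarball to stand in for acpython-1.3.tar.gz.
work = tempfile.mkdtemp()
pkg = os.path.join(work, "acpython-1.3")
os.makedirs(pkg)
open(os.path.join(pkg, "setup.py"), "w").close()
archive = os.path.join(work, "acpython-1.3.tar.gz")
with tarfile.open(archive, "w:gz") as tf:
    tf.add(pkg, arcname="acpython-1.3")

temp_dir = tempfile.mkdtemp()      # pip's per-package temp dir
download_dir = tempfile.mkdtemp()  # the --download target

unpack_file_url_buggy(archive, temp_dir, download_dir, only_download=True)
print(sorted(os.listdir(download_dir)))  # extracted tree, not the .tar.gz
print(sorted(os.listdir(temp_dir)))      # empty: later egg-info lookup fails
```

The download directory ends up with the extracted source tree rather than the archive, and the temp directory that the later egg-info code inspects stays empty.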
Here is the result of my investigation. I hope this is helpful. Pip unpacks every package and looks at its egg info, even when you are just trying to download. When you are only downloading, pip creates a temporary directory for each package, into which it tries to unpack the downloaded package, so that it can look at its egg info. So far, so good. The problem is that the unpack method behaves differently depending on whether the package was downloaded from a file URL or from an http URL. See the `unpack_url()` method (pip 1.4.1):

```python
def unpack_url(self, link, location, only_download=False):
    if only_download:
        loc = self.download_dir
    else:
        loc = location
    if is_vcs_url(link):
        return unpack_vcs_link(link, loc, only_download)
    # a local file:// index could have links with hashes
    elif not link.hash and is_file_url(link):
        return unpack_file_url(link, loc)
    else:
        if self.download_cache:
            self.download_cache = os.path.expanduser(self.download_cache)
        retval = unpack_http_url(link, location, self.download_cache,
                                 self.download_dir)
        if only_download:
            write_delete_marker_file(location)
        return retval
```

In my case, this method is called with:
For some reason, if only_download is true, `unpack_url()` ignores the location parameter, and the file is unpacked into the download directory instead of into the temp directory that pip just created. pip fails after it returns from this method, because the later code that gets the egg info expects the unpacked contents to be in the temp directory, not in the download directory.
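The direction of the eventual fix can be sketched from this analysis. A minimal sketch, not pip's actual code (the helper name `unpack_file_url_fixed`, the package name, and the directories are hypothetical): always unpack into `location` so the egg-info inspection works, and separately copy the intact archive into `download_dir`:

```python
import os
import shutil
import tarfile
import tempfile

def unpack_file_url_fixed(archive_path, location, download_dir=None):
    """Unpack a local archive into `location`; if `download_dir` is given,
    copy the original archive there instead of unpacking into it."""
    with tarfile.open(archive_path) as tf:
        tf.extractall(location)  # egg-info inspection reads from here
    if download_dir is not None:
        shutil.copy(archive_path, download_dir)  # archive stays intact

# Demo: build a tiny sdist-like tarball, then "download" it.
work = tempfile.mkdtemp()
pkg = os.path.join(work, "acme-1.0")
os.makedirs(pkg)
open(os.path.join(pkg, "setup.py"), "w").close()
archive = os.path.join(work, "acme-1.0.tar.gz")
with tarfile.open(archive, "w:gz") as tf:
    tf.add(pkg, arcname="acme-1.0")

temp_dir = tempfile.mkdtemp()      # pip's per-package temp dir
download_dir = tempfile.mkdtemp()  # the --download target

unpack_file_url_fixed(archive, temp_dir, download_dir)
print(sorted(os.listdir(temp_dir)))      # unpacked contents, for egg info
print(sorted(os.listdir(download_dir)))  # the archive itself
```

With this split, the temp directory holds the unpacked tree that the egg-info code expects, and the download directory receives the .tar.gz, matching what `unpack_http_url` already does via its `download_dir` argument.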
Bump this, as it's also causing issues with pip2pi, which recently switched to using
Also, to address the question of use-cases: this is important for
Closing due to merge of #1524. @wolever regarding your comment (#1111 (comment)), that case should work (in the develop branch) for sdists (*.tar.gz). Wheels currently have a bug though (#1112).
Awesome, this does look like it addresses the issue. Thanks!
As discovered in #1107, `--download` fails to work correctly with local `--find-links`: you get no error, but you get the contents of the archive dumped into the download dir, and not the archive. `pip.download.unpack_file_url` has no `download_dir` logic like `pip.download.unpack_http_url` does. Maybe we shouldn't support this, and just raise a command error saying the options are incompatible. It's odd in the first place to want to download from a local store.
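The fallback suggested here, rejecting the option combination outright, would amount to a small guard. A hypothetical sketch, not pip's actual code (the `CommandError` stand-in, the function name, and the message are all illustrative):

```python
class CommandError(Exception):
    """Stand-in for pip's command-level error type."""

def check_options(download_dir, find_links):
    # Hypothetical guard: refuse local --find-links when --download is set.
    local_links = [l for l in find_links
                   if not l.startswith(("http://", "https://"))]
    if download_dir and local_links:
        raise CommandError(
            "--download is not supported with local --find-links: %s"
            % ", ".join(local_links))

# A local find-links source combined with --download would be rejected:
try:
    check_options("downloads", ["./local"])
except CommandError as exc:
    print(exc)
```

The thread instead went the other way, making the combination work (via #1524), so this guard is only what the "raise a command error" alternative would have looked like.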