I have a local Pypiserver used in production to speed up installation. It has wheels for all the packages I use: whenever pypi.org does not have a wheel for one, I build it myself and add it to my Pypiserver; otherwise I just download it. Because I build some wheels, I would like to add their corresponding hashes to requirements.txt when I run:
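The original command is not shown in this report; a hypothetical invocation matching the setup described (the internal index URL is a placeholder) might look like:

```shell
pip-compile --generate-hashes \
    --index-url https://pypi.internal/simple \
    --extra-index-url https://pypi.org/simple \
    requirements.in
```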
I would also like to keep that --extra-index-url https://pypi.org/simple so I can tell when packages have new releases that are not in our local index. From what I understand, pip-compile will always try to install the most recent version allowed by the input's constraints.
My main issue is that pip-compile always uses hashes from pypi.org even when my server has the same versions. I do not know much about pip-compile's implementation, but after some debugging I found out it uses the Warehouse JSON API, which Pypiserver does not support as far as I know (I tried versions 1.2.1 and 1.4.2, the latter being the latest).
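For context, the Warehouse JSON API exposes per-file digests for each release, so a resolver can read sha256 hashes straight from the payload. The snippet below is a minimal sketch against a trimmed, made-up copy of the response shape (the package name, filenames, and digests are placeholders, not real data):

```python
# Trimmed, made-up payload in the shape of Warehouse's
# GET https://pypi.org/pypi/<name>/json response; digests are placeholders.
payload = {
    "releases": {
        "1.0.0": [
            {"filename": "example-1.0.0-py3-none-any.whl",
             "digests": {"sha256": "a" * 64}},
            {"filename": "example-1.0.0.tar.gz",
             "digests": {"sha256": "b" * 64}},
        ],
    },
}

def hashes_for(payload: dict, version: str) -> dict:
    """Map each distribution file of a release to its sha256 digest."""
    return {f["filename"]: f["digests"]["sha256"]
            for f in payload["releases"].get(version, [])}

print(hashes_for(payload, "1.0.0"))
```

An index that only implements the simple (PEP 503) API, as Pypiserver does according to the report above, never serves this payload, which would explain why the hashes always come from pypi.org.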
What has been tried
I tried adding the hashes to my requirements.in, but they were not taken into account. I also tried manually adding hashes to requirements.txt; however, they were removed when I re-ran the command (as expected).
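For reference, manually pinned hashes in a requirements file use pip's standard --hash syntax; the package name and digest below are placeholders, not real values:

```
example-package==1.0.0 \
    --hash=sha256:0000000000000000000000000000000000000000000000000000000000000000
```

Since pip-compile fully regenerates its output file, hand-added lines like this are overwritten on the next run, which matches the behaviour described above.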
Possible feature requests
I did not create this issue using the feature request template, as pip-tools might have some functionality I'm not aware of that could support my intended workflow. However, I do have some ideas in mind if that's not the case, and I can create a separate issue for any of them:
Add a flag to pip-compile to always generate hashes from files:
If I remove extra-index-url, I get several messages saying Couldn't get hashes from PyPI, fallback to hashing files. If I could somehow trigger this functionality directly instead of as a fallback, it would solve my issue.
Use hashes from input if available:
This would also work, but I would have to manually add new hashes for custom wheels, even for transitive dependencies.
Get hashes directly from Pypiserver:
Probably the cleanest solution, and there is already an issue describing it (consume .whl.METADATA files when available, #1211), but looking at the Warehouse issue it points to, it does not look like it will be implemented soon.
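The "fallback to hashing files" behaviour from the first idea amounts to locating each distribution file and hashing its bytes. A minimal sketch, assuming a local file path (hash_dist_file is a hypothetical name, not pip-tools' actual function):

```python
import hashlib
import os
import tempfile

def hash_dist_file(path: str, chunk_size: int = 1 << 20) -> str:
    """Return the sha256 digest of a local wheel/sdist, in pip's
    sha256:<hex> hash-checking format, reading in chunks to bound memory."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return "sha256:" + digest.hexdigest()

# Demo on a throwaway file; sha256(b"hello") is a well-known digest.
with tempfile.NamedTemporaryFile(delete=False, suffix=".whl") as tmp:
    tmp.write(b"hello")
    path = tmp.name
digest = hash_dist_file(path)
os.unlink(path)
print(digest)  # sha256:2cf24dba5fb0a30e26e83b2ac5b9e29e1b161e5c1fa7425e73043362938b9824
```

This is also what pip's hash-checking mode verifies at install time, so hashes produced this way match what --generate-hashes would emit for the same file.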
Thank you for this great piece of software! I really appreciate your time and help.
> Add a flag to pip-compile to always generate hashes from files:
> If I remove extra-index-url, I get several messages saying Couldn't get hashes from PyPI, fallback to hashing files. If I could somehow trigger this functionality directly instead of as a fallback, it would solve my issue.

> From what I understand, pip-compile will always try to install the most recent version allowed by the input's constraints.
Well, it also checks for an existing output file, and doesn't upgrade beyond the version specs pinned there if it doesn't have to in order to meet the input requirements.