https://pip.pypa.io/en/stable/
- Overview in the README
- Specifying packages to process
- requirements.txt
- Project metadata
- Distribution formats
- Using fetched dependencies
- Troubleshooting
The "pip packages" that Cachi2 can process are root directories of Python projects. They should have:
- One or more requirements files (unless the project has no dependencies)
- A file defining the project metadata
cachi2 fetch-deps \
--source ./my-repo \
--output ./cachi2-output \
'<JSON input>'
JSON input:
{
"type": "pip",
// path to the package (relative to the --source directory)
// defaults to "."
"path": ".",
// specify requirements files (relative to the package path)
// defaults to ["requirements.txt"] or [] if the file does not exist
"requirements_files": ["requirements.txt", "requirements-extra.txt"],
// specify *build* requirements files
// defaults to ["requirements-build.txt"] or [] if the file does not exist
"requirements_build_files": ["requirements-build.txt"],
// option to allow fetching binary distributions (wheels)
// defaults to false
"allow_binary": false
}
For more information on build requirements and binary distributions, see the Distribution formats section.
The main argument accepts alternative forms of input; see usage: Pre-fetch dependencies.
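For example, a complete invocation with all the defaults spelled out might look like this (paths are illustrative):
cachi2 fetch-deps \
  --source ./my-repo \
  --output ./cachi2-output \
  '{"type": "pip", "path": ".", "requirements_files": ["requirements.txt"], "allow_binary": false}'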
Cachi2 downloads dependencies explicitly declared in lockfiles. For pip, the closest thing to a lockfile is a "fully resolved" requirements.txt: it must list all transitive dependencies and pin them to exact versions.
A good way to generate requirements.txt is via pip-compile. Note that pip-compile supports reading dependencies directly from project files (e.g. pyproject.toml, setup.cfg, setup.py) or from "requirements.in" input files.
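pip-compile is provided by the pip-tools package; if you don't have it installed yet, a minimal setup could look like this:
# pip-compile ships with the pip-tools package; install it into a virtualenv
python3 -m venv venv && source venv/bin/activate
pip install pip-tools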
Example: pyproject.toml
[project]
name = "my_package"
version = "0.1.0"
dependencies = [
"requests",
"dockerfile-parse @ https://github.com/containerbuildsystem/dockerfile-parse/archive/refs/tags/2.0.0.tar.gz"
]
pip-compile pyproject.toml --generate-hashes
Example: requirements.in
# requirements.in
requests
dockerfile-parse @ https://github.com/containerbuildsystem/dockerfile-parse/archive/refs/tags/2.0.0.tar.gz
pip-compile requirements.in --generate-hashes
Result: requirements.txt
#
# This file is autogenerated by pip-compile with Python 3.10
# by the following command:
#
# pip-compile --generate-hashes pyproject.toml
#
certifi==2022.12.7 \
--hash=sha256:35824b4c3a97115964b408844d64aa14db1cc518f6562e8d7261699d1350a9e3 \
--hash=sha256:4ad3232f5e926d6718ec31cfc1fcadfde020920e278684144551c91769c7bc18
# via requests
charset-normalizer==3.0.1 \
--hash=sha256:00d3ffdaafe92a5dc603cb9bd5111aaa36dfa187c8285c543be562e61b755f6b \
--hash=sha256:024e606be3ed92216e2b6952ed859d86b4cfa52cd5bc5f050e7dc28f9b43ec42 \
--hash=sha256:0298eafff88c99982a4cf66ba2efa1128e4ddaca0b05eec4c456bbc7db691d8d \
--hash=sha256:02a51034802cbf38db3f89c66fb5d2ec57e6fe7ef2f4a44d070a593c3688667b \
--hash=sha256:083c8d17153ecb403e5e1eb76a7ef4babfc2c48d58899c98fcaa04833e7a2f9a \
--hash=sha256:0a11e971ed097d24c534c037d298ad32c6ce81a45736d31e0ff0ad37ab437d59 \
--hash=sha256:0bf2dae5291758b6f84cf923bfaa285632816007db0330002fa1de38bfcb7154 \
--hash=sha256:0c0a590235ccd933d9892c627dec5bc7511ce6ad6c1011fdf5b11363022746c1 \
--hash=sha256:0f438ae3532723fb6ead77e7c604be7c8374094ef4ee2c5e03a3a17f1fca256c \
--hash=sha256:109487860ef6a328f3eec66f2bf78b0b72400280d8f8ea05f69c51644ba6521a \
--hash=sha256:11b53acf2411c3b09e6af37e4b9005cba376c872503c8f28218c7243582df45d \
--hash=sha256:12db3b2c533c23ab812c2b25934f60383361f8a376ae272665f8e48b88e8e1c6 \
--hash=sha256:14e76c0f23218b8f46c4d87018ca2e441535aed3632ca134b10239dfb6dadd6b \
--hash=sha256:16a8663d6e281208d78806dbe14ee9903715361cf81f6d4309944e4d1e59ac5b \
--hash=sha256:292d5e8ba896bbfd6334b096e34bffb56161c81408d6d036a7dfa6929cff8783 \
--hash=sha256:2c03cc56021a4bd59be889c2b9257dae13bf55041a3372d3295416f86b295fb5 \
--hash=sha256:2e396d70bc4ef5325b72b593a72c8979999aa52fb8bcf03f701c1b03e1166918 \
--hash=sha256:2edb64ee7bf1ed524a1da60cdcd2e1f6e2b4f66ef7c077680739f1641f62f555 \
--hash=sha256:31a9ddf4718d10ae04d9b18801bd776693487cbb57d74cc3458a7673f6f34639 \
--hash=sha256:356541bf4381fa35856dafa6a965916e54bed415ad8a24ee6de6e37deccf2786 \
--hash=sha256:358a7c4cb8ba9b46c453b1dd8d9e431452d5249072e4f56cfda3149f6ab1405e \
--hash=sha256:37f8febc8ec50c14f3ec9637505f28e58d4f66752207ea177c1d67df25da5aed \
--hash=sha256:39049da0ffb96c8cbb65cbf5c5f3ca3168990adf3551bd1dee10c48fce8ae820 \
--hash=sha256:39cf9ed17fe3b1bc81f33c9ceb6ce67683ee7526e65fde1447c772afc54a1bb8 \
--hash=sha256:3ae1de54a77dc0d6d5fcf623290af4266412a7c4be0b1ff7444394f03f5c54e3 \
--hash=sha256:3b590df687e3c5ee0deef9fc8c547d81986d9a1b56073d82de008744452d6541 \
--hash=sha256:3e45867f1f2ab0711d60c6c71746ac53537f1684baa699f4f668d4c6f6ce8e14 \
--hash=sha256:3fc1c4a2ffd64890aebdb3f97e1278b0cc72579a08ca4de8cd2c04799a3a22be \
--hash=sha256:4457ea6774b5611f4bed5eaa5df55f70abde42364d498c5134b7ef4c6958e20e \
--hash=sha256:44ba614de5361b3e5278e1241fda3dc1838deed864b50a10d7ce92983797fa76 \
--hash=sha256:4a8fcf28c05c1f6d7e177a9a46a1c52798bfe2ad80681d275b10dcf317deaf0b \
--hash=sha256:4b0d02d7102dd0f997580b51edc4cebcf2ab6397a7edf89f1c73b586c614272c \
--hash=sha256:502218f52498a36d6bf5ea77081844017bf7982cdbe521ad85e64cabee1b608b \
--hash=sha256:503e65837c71b875ecdd733877d852adbc465bd82c768a067badd953bf1bc5a3 \
--hash=sha256:5995f0164fa7df59db4746112fec3f49c461dd6b31b841873443bdb077c13cfc \
--hash=sha256:59e5686dd847347e55dffcc191a96622f016bc0ad89105e24c14e0d6305acbc6 \
--hash=sha256:601f36512f9e28f029d9481bdaf8e89e5148ac5d89cffd3b05cd533eeb423b59 \
--hash=sha256:608862a7bf6957f2333fc54ab4399e405baad0163dc9f8d99cb236816db169d4 \
--hash=sha256:62595ab75873d50d57323a91dd03e6966eb79c41fa834b7a1661ed043b2d404d \
--hash=sha256:70990b9c51340e4044cfc394a81f614f3f90d41397104d226f21e66de668730d \
--hash=sha256:71140351489970dfe5e60fc621ada3e0f41104a5eddaca47a7acb3c1b851d6d3 \
--hash=sha256:72966d1b297c741541ca8cf1223ff262a6febe52481af742036a0b296e35fa5a \
--hash=sha256:74292fc76c905c0ef095fe11e188a32ebd03bc38f3f3e9bcb85e4e6db177b7ea \
--hash=sha256:761e8904c07ad053d285670f36dd94e1b6ab7f16ce62b9805c475b7aa1cffde6 \
--hash=sha256:772b87914ff1152b92a197ef4ea40efe27a378606c39446ded52c8f80f79702e \
--hash=sha256:79909e27e8e4fcc9db4addea88aa63f6423ebb171db091fb4373e3312cb6d603 \
--hash=sha256:7e189e2e1d3ed2f4aebabd2d5b0f931e883676e51c7624826e0a4e5fe8a0bf24 \
--hash=sha256:7eb33a30d75562222b64f569c642ff3dc6689e09adda43a082208397f016c39a \
--hash=sha256:81d6741ab457d14fdedc215516665050f3822d3e56508921cc7239f8c8e66a58 \
--hash=sha256:8499ca8f4502af841f68135133d8258f7b32a53a1d594aa98cc52013fff55678 \
--hash=sha256:84c3990934bae40ea69a82034912ffe5a62c60bbf6ec5bc9691419641d7d5c9a \
--hash=sha256:87701167f2a5c930b403e9756fab1d31d4d4da52856143b609e30a1ce7160f3c \
--hash=sha256:88600c72ef7587fe1708fd242b385b6ed4b8904976d5da0893e31df8b3480cb6 \
--hash=sha256:8ac7b6a045b814cf0c47f3623d21ebd88b3e8cf216a14790b455ea7ff0135d18 \
--hash=sha256:8b8af03d2e37866d023ad0ddea594edefc31e827fee64f8de5611a1dbc373174 \
--hash=sha256:8c7fe7afa480e3e82eed58e0ca89f751cd14d767638e2550c77a92a9e749c317 \
--hash=sha256:8eade758719add78ec36dc13201483f8e9b5d940329285edcd5f70c0a9edbd7f \
--hash=sha256:911d8a40b2bef5b8bbae2e36a0b103f142ac53557ab421dc16ac4aafee6f53dc \
--hash=sha256:93ad6d87ac18e2a90b0fe89df7c65263b9a99a0eb98f0a3d2e079f12a0735837 \
--hash=sha256:95dea361dd73757c6f1c0a1480ac499952c16ac83f7f5f4f84f0658a01b8ef41 \
--hash=sha256:9ab77acb98eba3fd2a85cd160851816bfce6871d944d885febf012713f06659c \
--hash=sha256:9cb3032517f1627cc012dbc80a8ec976ae76d93ea2b5feaa9d2a5b8882597579 \
--hash=sha256:9cf4e8ad252f7c38dd1f676b46514f92dc0ebeb0db5552f5f403509705e24753 \
--hash=sha256:9d9153257a3f70d5f69edf2325357251ed20f772b12e593f3b3377b5f78e7ef8 \
--hash=sha256:a152f5f33d64a6be73f1d30c9cc82dfc73cec6477ec268e7c6e4c7d23c2d2291 \
--hash=sha256:a16418ecf1329f71df119e8a65f3aa68004a3f9383821edcb20f0702934d8087 \
--hash=sha256:a60332922359f920193b1d4826953c507a877b523b2395ad7bc716ddd386d866 \
--hash=sha256:a8d0fc946c784ff7f7c3742310cc8a57c5c6dc31631269876a88b809dbeff3d3 \
--hash=sha256:ab5de034a886f616a5668aa5d098af2b5385ed70142090e2a31bcbd0af0fdb3d \
--hash=sha256:c22d3fe05ce11d3671297dc8973267daa0f938b93ec716e12e0f6dee81591dc1 \
--hash=sha256:c2ac1b08635a8cd4e0cbeaf6f5e922085908d48eb05d44c5ae9eabab148512ca \
--hash=sha256:c512accbd6ff0270939b9ac214b84fb5ada5f0409c44298361b2f5e13f9aed9e \
--hash=sha256:c75ffc45f25324e68ab238cb4b5c0a38cd1c3d7f1fb1f72b5541de469e2247db \
--hash=sha256:c95a03c79bbe30eec3ec2b7f076074f4281526724c8685a42872974ef4d36b72 \
--hash=sha256:cadaeaba78750d58d3cc6ac4d1fd867da6fc73c88156b7a3212a3cd4819d679d \
--hash=sha256:cd6056167405314a4dc3c173943f11249fa0f1b204f8b51ed4bde1a9cd1834dc \
--hash=sha256:db72b07027db150f468fbada4d85b3b2729a3db39178abf5c543b784c1254539 \
--hash=sha256:df2c707231459e8a4028eabcd3cfc827befd635b3ef72eada84ab13b52e1574d \
--hash=sha256:e62164b50f84e20601c1ff8eb55620d2ad25fb81b59e3cd776a1902527a788af \
--hash=sha256:e696f0dd336161fca9adbb846875d40752e6eba585843c768935ba5c9960722b \
--hash=sha256:eaa379fcd227ca235d04152ca6704c7cb55564116f8bc52545ff357628e10602 \
--hash=sha256:ebea339af930f8ca5d7a699b921106c6e29c617fe9606fa7baa043c1cdae326f \
--hash=sha256:f4c39b0e3eac288fedc2b43055cfc2ca7a60362d0e5e87a637beac5d801ef478 \
--hash=sha256:f5057856d21e7586765171eac8b9fc3f7d44ef39425f85dbcccb13b3ebea806c \
--hash=sha256:f6f45710b4459401609ebebdbcfb34515da4fc2aa886f95107f556ac69a9147e \
--hash=sha256:f97e83fa6c25693c7a35de154681fcc257c1c41b38beb0304b9c4d2d9e164479 \
--hash=sha256:f9d0c5c045a3ca9bedfc35dca8526798eb91a07aa7a2c0fee134c6c6f321cbd7 \
--hash=sha256:ff6f3db31555657f3163b15a6b7c6938d08df7adbfc9dd13d9d19edad678f1e8
# via requests
dockerfile-parse @ https://github.com/containerbuildsystem/dockerfile-parse/archive/refs/tags/2.0.0.tar.gz \
--hash=sha256:36e4469abb0d96b0e3cd656284d5016e8a674cd57b8ebe5af64786fe63b8184d
# via my-package (pyproject.toml)
idna==3.4 \
--hash=sha256:814f528e8dead7d329833b91c5faa87d60bf71824cd12a7530b5526063d02cb4 \
--hash=sha256:90b77e79eaa3eba6de819a0c442c0b4ceefc341a7a2ab77d7562bf49f425c5c2
# via requests
requests==2.28.2 \
--hash=sha256:64299f4909223da747622c030b781c0d7811e359c37124b4bd368fb8c6518baa \
--hash=sha256:98b1b2782e3c6c4904938b84c0eb932721069dfdb9134313beff7c83c2df24bf
# via my-package (pyproject.toml)
urllib3==1.26.14 \
--hash=sha256:076907bf8fd355cde77728471316625a4d2f7e713c125f51953bb5b3eecf4f72 \
--hash=sha256:75edcdc2f7d85b137124a6c3c9fc3933cdeaa12ecb9a6a959f22797a0feca7e1
# via requests
Using hashes is strongly recommended; see pip's hash-checking mode: https://pip.pypa.io/en/stable/topics/secure-installs/#hash-checking-mode
If using pip-compile, use the --generate-hashes option.
For dependencies coming from somewhere other than PyPI, Cachi2 supports a subset of the PEP 440 direct references.
dockerfile-parse @ https://github.com/containerbuildsystem/dockerfile-parse/archive/refs/tags/2.0.0.tar.gz \
--hash=sha256:36e4469abb0d96b0e3cd656284d5016e8a674cd57b8ebe5af64786fe63b8184d
For https dependencies, Cachi2 requires exactly one --hash option as protection from remote tampering.
Note that if at least one dependency in your requirements file uses --hash, pip requires hashes for all dependencies.
Use pip-compile --generate-hashes to generate compliant requirements files.
Cachi2 does not support PEP 440 hashes in the url fragment, only --hash options.
dockerfile-parse @ git+https://github.com/containerbuildsystem/dockerfile-parse@b6230230987950cfb16d8858c6f9a9642f4d0952
Git dependencies are incompatible with pip's hash checking. Please use an https url instead, if possible:
- dockerfile-parse @ git+https://github.com/containerbuildsystem/dockerfile-parse@b6230230987950cfb16d8858c6f9a9642f4d0952
+ dockerfile-parse @ https://github.com/containerbuildsystem/dockerfile-parse/archive/refs/tags/2.0.0.tar.gz \
+ --hash=sha256:36e4469abb0d96b0e3cd656284d5016e8a674cd57b8ebe5af64786fe63b8184d
If you do need to use a git url, Cachi2 requires that it specify a full commit hash. Cachi2 does not support PEP 440 commit hashes in the url fragment (the # part), only directly after the @.
Note: it is impossible to craft a requirements.txt file that downloads dependencies from both https urls and git urls: Cachi2 requires hashes for https urls, using even one --hash makes pip require hashes for everything, and pip does not support hashes for git dependencies. Please use https urls instead.
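To check locally that a requirements file passes pip's hash-checking mode before handing it to Cachi2, one quick sketch (the venv path is illustrative) is:
# any missing or mismatched hash makes the install fail
python3 -m venv /tmp/hash-check && source /tmp/hash-check/bin/activate
pip install --require-hashes -r requirements.txt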
Requirements files support some pip install options; see https://pip.pypa.io/en/stable/reference/requirements-file-format/#supported-options.
Cachi2 supports a small subset of them, ignores those that are not relevant for prefetching, and raises an error for those that are relevant but not supported.
The --index-url option (supported since cachi2-v0.8.0) makes Cachi2 download packages from the specified Python package index server.
Note: it applies to all the packages (and only the packages) declared in the file that contains the --index-url option. If file A contains --index-url and file B does not, Cachi2 downloads the packages declared in B from the default index server (https://pypi.org/simple/).
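For example, a requirements file that pulls its packages from a private index could start like this (the index URL is hypothetical):
# requirements.txt -- everything in this file comes from the custom index
--index-url https://pypi.example.org/simple
requests==2.28.2 \
    --hash=sha256:64299f4909223da747622c030b781c0d7811e359c37124b4bd368fb8c6518baa \
    --hash=sha256:98b1b2782e3c6c4904938b84c0eb932721069dfdb9134313beff7c83c2df24bf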
To authenticate to the index server, you can use a .netrc file: https://pip.pypa.io/en/stable/topics/authentication/#netrc-support
The --require-hashes option enables hash-checking mode. It is typically redundant, since the presence of any --hash option enables hash-checking mode as well.
The --trusted-host option disables HTTPS validation for a host. Don't use this for production builds.
The --hash option specifies the expected hashes for package archives. See also the hashes section.
Cachi2 looks for the name and version of your project in the following project files: pyproject.toml, setup.cfg and setup.py.
If Cachi2 fails to resolve the project name, it generates a name based on the git repository origin url (and the package subpath, if the package is not in the repository root). If Cachi2 fails to resolve the version, it omits the version.
pyproject.toml: PEP 621 metadata
Supported cases:
[project]
name = "my_package"
version = "0.1.0"
Unsupported cases:
[project]
name = "my_package"
dynamic = ["version"]
setup.cfg
Supported cases:
[metadata]
name = my_package
version = 0.1.0
[metadata]
name = my_package
version = file: VERSION
# taken from ./VERSION
# example content:
# 0.1.0
[metadata]
name = my_package
version = attr: my_package.VERSION
# taken from my_package/__init__.py or my_package.py
# example content:
# VERSION = "0.1.0"
Unsupported cases:
- missing version
- some forms of version = attr: (those that would require executing the module)
setup.py: note that using setup.py is discouraged.
Supported cases:
setup(name="my_package", version="0.1.0", ...)
# basic variable usage is supported
NAME = "my_package"
VERSION = "0.1.0"
if __name__ == "__main__":
# setup() call can be anywhere in the file
setup(name=NAME, version=VERSION, ...)
Python projects typically publish both a binary format (called a wheel) and a source format (called an sdist).
Wheels are much more convenient: they are the pre-built format, and installing a wheel amounts to unzipping it and copying the files to the right place.
Sdists are more difficult to install. Pip must first build a wheel from the sdist using a PEP 517 build system. To do that, pip has to install the build system and its dependencies (defined via PEP 518).
Cachi2 (unlike the older Cachito) can download both wheels and sdists. The allow_binary option controls this behavior:
- "allow_binary": true - download both wheels and sdists
- "allow_binary": false - download only sdists (default)
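For example, to prefetch wheels as well, pass the option in the JSON input (same command as above; paths are illustrative):
cachi2 fetch-deps \
  --source ./my-repo \
  --output ./cachi2-output \
  '{"type": "pip", "allow_binary": true}'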
Note: Cachi2 currently downloads one sdist and all the available wheels per dependency (no filtering is done by platform or Python version).
Pre-fetching and building with wheels is much easier and faster than pre-fetching and building from source (even without filtering of wheels). However, downloading all the wheels naturally results in a much larger overall download size. Based on sample testing, wheels + sdists will be approximately 5x to 15x larger than just the sdists. When building with wheels, dealing with build dependencies via requirements-build.txt is unnecessary.
Building wheels from sdists takes a long time, but building from source gives you an important guarantee which using pre-built wheels does not: what you installed matches the source code. This can be especially important for Python packages implemented in C or other compiled languages.
To allow building from source in a network-isolated environment, Cachi2 must download all the PEP 517 build dependencies before the build starts.
Cachi2 requires a fully resolved requirements-build.txt to do this. The file follows the same rules as requirements.txt, but contains build dependencies rather than runtime dependencies.
Note: this file must contain all the transitive build dependencies of each of your transitive runtime dependencies (since you are installing the runtime dependencies from source, every one of them needs its build dependencies as well).
There's no great way to generate such a file. As far as we know, the best solution is pip-compile combined with this standalone script that lives in the old Cachito repo: pip_find_builddeps.py.
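A sketch of fetching the script with curl; the exact path inside the Cachito repository is an assumption, so verify it against the repository:
# the raw URL below is an assumption -- check the Cachito repository if it does not resolve
curl -fsSL -o pip_find_builddeps.py \
  https://raw.githubusercontent.com/containerbuildsystem/cachito/master/bin/pip_find_builddeps.py
chmod +x pip_find_builddeps.py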
1. Generate a fully resolved requirements.txt.
2. Get the pip_find_builddeps.py script (you can download it directly from GitHub, e.g. as sketched above; it has no runtime dependencies other than pip).
3. If your project itself has build dependencies (typically defined in pyproject.toml), copy them to requirements-build.in.
Example: pyproject.toml build dependencies
# pyproject.toml
[build-system]
requires = ["pdm-pep517"]
build-backend = "pdm.pep517.api"
Copy dependencies from build-system.requires:
# requirements-build.in
pdm-pep517
4. Run the pip_find_builddeps.py script and pip-compile its output:
pip_find_builddeps.py requirements.txt \
--append \
--only-write-on-update \
-o requirements-build.in
pip-compile requirements-build.in --allow-unsafe --generate-hashes
Result: requirements-build.txt
#
# This file is autogenerated by pip-compile with Python 3.10
# by the following command:
#
# pip-compile --allow-unsafe --generate-hashes requirements-build.in
#
flit-core==3.8.0 \
--hash=sha256:64a29ec845164a6abe1136bf4bc5ae012bdfe758ed42fc7571a9059a7c80bd83 \
--hash=sha256:b305b30c99526df5e63d6022dd2310a0a941a187bd3884f4c8ef0418df6c39f3
# via -r requirements-build.in
pdm-pep517==1.0.6 \
--hash=sha256:8ec0c13cbacc9b94a9820be7db9e513c81a48ebca0bb826eb743fb1b22d144a0 \
--hash=sha256:a4407703d50fa4d671383a354868b05a13060c1bf38264cbb5ddc9a73e4a1dc5
# via -r requirements-build.in
setuptools==66.1.1 \
--hash=sha256:6f590d76b713d5de4e49fe4fbca24474469f53c83632d5d0fd056f7ff7e8112b \
--hash=sha256:ac4008d396bc9cd983ea483cb7139c0240a07bbc74ffb6232fceffedc6cf03a8
# via -r requirements-build.in
wheel==0.38.4 \
--hash=sha256:965f5259b566725405b05e7cf774052044b1ed30119b5d586b2703aafe8719ac \
--hash=sha256:b60533f3f5d530e971d6737ca6d58681ee434818fab630c83a734bb10c083ce8
# via -r requirements-build.in
Adding a requirements-build.txt should not require changes in your build process. Pip installs the build dependencies automatically as needed; you don't have to install them explicitly. The purpose of requirements-build.txt is to make Cachi2 fetch the build dependencies and provide them to pip for offline installation.
See also usage.md for a complete example of Cachi2 usage.
Cachi2 downloads the Python dependencies into the deps/pip/ subpath of the output directory. The directory is a flat list of the downloaded distributions of your runtime and build dependencies.
cachi2-output/deps/pip
├── certifi-2022.12.7.tar.gz
├── ...
├── pdm-pep517-1.0.6.tar.gz
├── requests-2.28.2.tar.gz
├── ...
└── wheel-0.38.4.tar.gz
To make pip use the downloaded archives, use the --find-links and --no-index options. The --find-links option tells pip to look for dependency archives in a directory, and --no-index prevents pip from preferring PyPI over that local directory. Pip also accepts these settings as environment variables; Cachi2 generates PIP_FIND_LINKS and PIP_NO_INDEX for you. See usage: generate environment variables for more details.
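For example, an offline install driven purely by pip options (paths are illustrative; the environment variables generated by Cachi2 achieve the same effect):
# point pip at the prefetched archives and forbid contacting PyPI
pip install \
  --no-index \
  --find-links ./cachi2-output/deps/pip \
  -r requirements.txt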
It gets a bit trickier with external dependencies. Pip does not respect the --find-links option for dependencies specified via urls. Instead, Cachi2 rewrites your requirements.txt file(s) in place to replace the urls with file paths (after you call the cachi2 inject-files subcommand).
- dockerfile-parse @ https://github.com/.../2.0.0.tar.gz \
+ dockerfile-parse @ file:///absolute-path/cachi2-output/deps/pip/.../dockerfile-parse-...tar.gz
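The rewrite happens when you run the inject-files subcommand; a sketch based on usage.md, where --for-output-dir is the absolute path at which the output directory will be available during the build:
# rewrite url requirements to file:// paths under the output directory
cachi2 inject-files ./cachi2-output --for-output-dir /tmp/cachi2-output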
External dependencies are stored a bit further down the deps/pip tree to avoid mixing them with PyPI dependencies. The path and filename are implementation details.
cachi2-output/deps/pip
├── ...
├── external-dockerfile-parse
│ └── dockerfile-parse-external-sha256-36e4469abb0d96b0e3cd656284d5016e8a674cd57b8ebe5af64786fe63b8184d.tar.gz
└── ...
This section covers common issues you may face when fetching dependencies or when installing the fetched dependencies.
First, please make sure that your project meets Cachi2's requirements (this document) and that you are using Cachi2 as intended (usage.md).
Have you read Building from source?
Even if you have all the build dependencies available, installing from source can come with unforeseen complications. Pip's --no-binary flag can help debug faster.
# on your machine
virtualenv venv && source venv/bin/activate
# or in a container
podman run --rm -ti -v "$PWD:$PWD:z" -w "$PWD" ubi8/python-39 bash
pip install --no-binary :all: -r requirements.txt
Notably, older versions of pip and setuptools have a fair share of bugs related to PEP 517 handling. A good first course of action can be to upgrade pip and setuptools and try again.
Other pip install options such as --use-pep517 may also be of interest.
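For example, before retrying the source install in your debug environment:
# upgrade the build tooling first, then retry building everything from source
pip install --upgrade pip setuptools wheel
pip install --use-pep517 --no-binary :all: -r requirements.txt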
Problem: you've found out that some build errors are caused by bugs in an older pip version, but the base image for your container build comes with pip==<old> and you cannot upgrade during the build because you're building with network isolation.
Solution: make Cachi2 fetch a newer pip for you. Then you can upgrade pip from the prefetched archive.
# add to requirements-build.txt or use a separate file
pip==22.3.1 --hash=...
RUN source /tmp/cachi2.env && \
pip install -U pip && \
pip install .
You can use a similar approach to upgrade setuptools or other build dependencies before installing your app. Build dependencies other than pip should already be part of requirements-build.txt.
Building dependencies written in C typically requires gcc, CPython headers and other development libraries. Cachi2 does not fetch these; getting them into the build is up to you. The best-case scenario, if you're building a container, is that the base image already contains everything you need. For example, the ubi8/python-39 image contains most of the typical development libraries.
To find out what non-Python dependencies you need, try to pip install --no-binary :all: in a clean environment (e.g. a container) as shown above. The error messages you get should hopefully point you to the required dependencies.
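Once you know what is missing, install it into the build environment yourself; for example, on a UBI/Fedora-style image (the package set below is only a typical guess, adjust it to your dependencies):
# common prerequisites for building C extensions; package names vary by distribution
dnf install -y gcc python3-devel libffi-devel openssl-devel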
For dependencies compiled from other languages, such as Rust, we don't know of any good solutions for offline installation. If you do manage to make it work, please let us know.
Some projects do not distribute sdists to PyPI. For example, tensorflow (as of version 2.11.0) distributes only wheels.
Possible workarounds:
- Enable pre-fetching of wheels using "allow_binary": true in the JSON input.
- Find the git repository for the project and get the source tarball for a release. In requirements.txt, specify the dependency via an https url:
- tensorflow==2.11.0
+ tensorflow @ https://github.com/tensorflow/tensorflow/archive/refs/tags/v2.11.0.tar.gz \
+ --hash=sha256:99c732b92b1b37fc243a559e02f9aef5671771e272758aa4aec7f34dc92dac48