
Add options for including build dependencies in compiled output #1681

Merged (3 commits) on Nov 1, 2023
1 change: 1 addition & 0 deletions .pre-commit-config.yaml
@@ -35,6 +35,7 @@ repos:
- build==1.0.0
- pyproject_hooks==1.0.0
- pytest==7.4.2
language_version: python3.8
- repo: https://github.com/PyCQA/bandit
rev: 1.7.5
hooks:
66 changes: 66 additions & 0 deletions README.md
@@ -548,6 +548,70 @@ dependencies, making any newly generated `requirements.txt` environment-dependent.
As a general rule, users are still advised to execute `pip-compile` on each
targeted Python environment to avoid issues.

### Maximizing reproducibility

`pip-tools` is a great tool to improve the reproducibility of builds.
But there are a few things to keep in mind.

- `pip-compile` will produce different results in different environments as described in the previous section.
- `pip` must be used with the `PIP_CONSTRAINT` environment variable to lock dependencies in build environments, as documented in [#8439](https://github.com/pypa/pip/issues/8439).
- Dependencies come from many sources.

Continuing the `pyproject.toml` example from earlier, a single lock file can be created like this:

```console
$ pip-compile --all-build-deps --all-extras --output-file=constraints.txt --strip-extras pyproject.toml
#
# This file is autogenerated by pip-compile with Python 3.9
# by the following command:
#
# pip-compile --all-build-deps --all-extras --output-file=constraints.txt --strip-extras pyproject.toml
#
asgiref==3.5.2
    # via django
attrs==22.1.0
    # via pytest
backports-zoneinfo==0.2.1
    # via django
django==4.1
    # via my-cool-django-app (pyproject.toml)
editables==0.3
    # via hatchling
hatchling==1.11.1
    # via my-cool-django-app (pyproject.toml::build-system.requires)
iniconfig==1.1.1
    # via pytest
packaging==21.3
    # via
    #   hatchling
    #   pytest
pathspec==0.10.2
    # via hatchling
pluggy==1.0.0
    # via
    #   hatchling
    #   pytest
py==1.11.0
    # via pytest
pyparsing==3.0.9
    # via packaging
pytest==7.1.2
    # via my-cool-django-app (pyproject.toml)
sqlparse==0.4.2
    # via django
tomli==2.0.1
    # via
    #   hatchling
    #   pytest
```
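
The resulting `constraints.txt` can then be fed back to `pip` through the `PIP_CONSTRAINT` environment variable, which, unlike a `-c` option on the command line, is also honoured inside isolated build environments (see the pip issue linked above). A minimal sketch, assuming the example project from this section:

```console
$ PIP_CONSTRAINT=constraints.txt pip install .
```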

Some build backends may also request build dependencies dynamically using the `get_requires_for_build_` hooks described in [PEP 517] and [PEP 660].
This will be indicated in the output with one of the following suffixes:

- `(pyproject.toml::build-system.backend::editable)`
- `(pyproject.toml::build-system.backend::sdist)`
- `(pyproject.toml::build-system.backend::wheel)`
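
For illustration, such a dynamic requirement originates from an optional backend hook like the one sketched below (a hypothetical backend module, not any real backend); whatever it returns is what `pip-compile --all-build-deps` records with the corresponding suffix:

```python
# Hypothetical PEP 517 backend hook. Requirements returned here are in
# addition to ``[build-system] requires`` and would show up in the compiled
# output with the ``(pyproject.toml::build-system.backend::wheel)`` suffix.
def get_requires_for_build_wheel(config_settings=None):
    return ["wheel>=0.41"]
```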

### Other useful tools

- [pip-compile-multi](https://pip-compile-multi.readthedocs.io/en/latest/) - pip-compile command wrapper for multiple cross-referencing requirements files.
@@ -600,5 +664,7 @@ note that it is deprecated and will be removed in a future release.
[Matrix Space]: https://matrix.to/#/%23jazzband:matrix.org
[pip-tools-overview]: https://github.com/jazzband/pip-tools/raw/main/img/pip-tools-overview.svg
[environment-markers]: https://peps.python.org/pep-0508/#environment-markers
[PEP 517]: https://peps.python.org/pep-0517/
[PEP 660]: https://peps.python.org/pep-0660/
[discord-chat]: https://discord.gg/pypa
[discord-chat-image]: https://img.shields.io/discord/803025117553754132?label=Discord%20chat%20%23pip-tools&style=flat-square
40 changes: 40 additions & 0 deletions examples/readme/constraints.txt
@@ -0,0 +1,40 @@
#
# This file is autogenerated by pip-compile with Python 3.11
# by the following command:
#
# pip-compile --all-build-deps --all-extras --output-file=constraints.txt --strip-extras pyproject.toml
#
asgiref==3.5.2
    # via django
attrs==22.1.0
    # via pytest
django==4.1
    # via my-cool-django-app (pyproject.toml)
editables==0.3
    # via hatchling
hatchling==1.11.1
    # via my-cool-django-app (pyproject.toml::build-system.requires)
iniconfig==1.1.1
    # via pytest
packaging==21.3
    # via
    #   hatchling
    #   pytest
pathspec==0.10.2
    # via hatchling
pluggy==1.0.0
    # via
    #   hatchling
    #   pytest
py==1.11.0
    # via pytest
pyparsing==3.0.9
    # via packaging
pytest==7.1.2
    # via my-cool-django-app (pyproject.toml)
sqlparse==0.4.2
    # via django
tomli==2.0.1
    # via
    #   hatchling
    #   pytest
11 changes: 11 additions & 0 deletions examples/readme/pyproject.toml
@@ -0,0 +1,11 @@
[build-system]
requires = ["hatchling"]
build-backend = "hatchling.build"

[project]
name = "my-cool-django-app"
version = "42"
dependencies = ["django"]

[project.optional-dependencies]
dev = ["pytest"]
174 changes: 174 additions & 0 deletions piptools/build.py
@@ -0,0 +1,174 @@
from __future__ import annotations

import collections
import contextlib
import pathlib
import sys
import tempfile
from dataclasses import dataclass
from importlib import metadata as importlib_metadata
from typing import Any, Iterator, Protocol, TypeVar, overload

import build
import build.env
import pyproject_hooks
from pip._internal.req import InstallRequirement
from pip._internal.req.constructors import install_req_from_line, parse_req_from_line

PYPROJECT_TOML = "pyproject.toml"

_T = TypeVar("_T")


if sys.version_info >= (3, 10):
    from importlib.metadata import PackageMetadata
else:

    class PackageMetadata(Protocol):
        @overload
        def get_all(self, name: str, failobj: None = None) -> list[Any] | None:
            ...

        @overload
        def get_all(self, name: str, failobj: _T) -> list[Any] | _T:
            ...


@dataclass
class ProjectMetadata:
    extras: tuple[str, ...]
    requirements: tuple[InstallRequirement, ...]
    build_requirements: tuple[InstallRequirement, ...]


def build_project_metadata(
    src_file: pathlib.Path,
    build_targets: tuple[str, ...],
    *,
    isolated: bool,
    quiet: bool,
) -> ProjectMetadata:
"""
Return the metadata for a project.
Uses the ``prepare_metadata_for_build_wheel`` hook for the wheel metadata
if available, otherwise ``build_wheel``.
Uses the ``prepare_metadata_for_build_{target}`` hook for each ``build_targets``
if available.
:param src_file: Project source file
:param build_targets: A tuple of build targets to get the dependencies
of (``sdist`` or ``wheel`` or ``editable``).
:param isolated: Whether to run invoke the backend in the current
environment or to create an isolated one and invoke it
there.
:param quiet: Whether to suppress the output of subprocesses.
"""

    src_dir = src_file.parent
    with _create_project_builder(src_dir, isolated=isolated, quiet=quiet) as builder:
        metadata = _build_project_wheel_metadata(builder)
        extras = tuple(metadata.get_all("Provides-Extra") or ())
        requirements = tuple(
            _prepare_requirements(metadata=metadata, src_file=src_file)
        )
        build_requirements = tuple(
            _prepare_build_requirements(
                builder=builder,
                src_file=src_file,
                build_targets=build_targets,
                package_name=_get_name(metadata),
            )
        )
        return ProjectMetadata(
            extras=extras,
            requirements=requirements,
            build_requirements=build_requirements,
        )


@contextlib.contextmanager
def _create_project_builder(
    src_dir: pathlib.Path, *, isolated: bool, quiet: bool
) -> Iterator[build.ProjectBuilder]:
    if quiet:
        runner = pyproject_hooks.quiet_subprocess_runner
    else:
        runner = pyproject_hooks.default_subprocess_runner

    if not isolated:
        yield build.ProjectBuilder(src_dir, runner=runner)
        return

    with build.env.DefaultIsolatedEnv() as env:
        builder = build.ProjectBuilder.from_isolated_env(env, src_dir, runner)
        env.install(builder.build_system_requires)
        env.install(builder.get_requires_for_build("wheel"))
Review thread on lines +106 to +107 (marked as resolved):

webknjaz (Member): @apljungquist looks like these installs have to be called with `os.environ['PIP_CONSTRAINT']` set to a generated file with `-P` values dumped in order to install the backend deps we desire.

ncoghlan: @webknjaz Checking I understand this comment as being related to this note in the README: "`pip` must be used with the `PIP_CONSTRAINT` environment variable to lock dependencies in build environments as documented in #8439."

Rather than just installing the requirements as listed in the metadata, you're suggesting that:

1. The current value of `os.environ["PIP_CONSTRAINT"]` should be saved to avoid irrevocably clobbering it.
2. For each `env.install` step, in addition to the requirements being passed explicitly, they also need to be added to a temporary file referenced from `PIP_CONSTRAINT` to force the dependencies to actually be respected.
3. Once the installs are done, `PIP_CONSTRAINT` should be reverted to its original value.

To try to get CI going, I'm going to try just setting `PIP_CONSTRAINT` in the failing test case, which would mean this discovery still needs its own bug report.

ncoghlan: Attempted workaround in bc3f17b still didn't get the affected test case passing when testing locally, but I'm trying it in CI anyway in case the local failure was specific to Python 3.12.

ncoghlan: I tried a different workaround, specifying a known version of setuptools in the test metadata: c2bd925

ncoghlan: Even with both attempted workarounds combined, the build dependency compilation test still fails: https://github.com/jazzband/pip-tools/pull/2105/files#diff-99b06d953a9978ee564df2e67547b8a76f3b1ab116ab7c89a9005b9239f2a904

This is with setuptools 70.0.0 explicitly requested in the test project config, and with a constraints file constraining both setuptools and wheel set via the pip-tools CLI option and via the `PIP_CONSTRAINT` environment variable.

webknjaz (Member): @ncoghlan

> For each `env.install` step, in addition to the requirements being passed explicitly, they also need to be added to a temporary file referenced from `PIP_CONSTRAINT` to force the dependencies to actually be respected.

I think that the `[build-system] requires` value is there already as an explicit install request. But the constraints coming from other places, like the CLI command or other/additional input files, are not taken into account. I think they should be dumped into a temporary file, and that file would need to be set in `PIP_CONSTRAINT` around such invocations.

Yes to points (1) and (3), though.

chrysle (Contributor): @webknjaz Your suggestion to set the `PIP_CONSTRAINT` environment variable appears to work, but another problem surfaced: the failing test in question doesn't use build isolation (it succeeds when that is explicitly requested). What do you suggest should be done in such a case?

webknjaz (Member): @chrysle oh, I missed this. Could you clarify what you mean by requesting it? Are you talking about calling some tooling outside the test suite or making said change in the tests? Perhaps a PR would make it more obvious what you mean.

chrysle (Contributor): No problem! I was referring to the `--build-isolation` boolean flag. The failing test explicitly turned that off:

    "--no-build-isolation",

However, this means that `_create_project_builder` yields a `build.ProjectBuilder` relying on the outside environment and the (latest) version of setuptools that was installed together with pip-tools, no doubt to improve performance, but then the test naturally fails.

pip-tools/piptools/build.py, lines 192 to 194 at 5330964:

    if not isolated:
        yield build.ProjectBuilder(src_dir, runner=runner)
        return

Using build isolation makes the test pass after setting `PIP_CONSTRAINT` as you suggested.

webknjaz (Member): @chrysle thanks for the explanation! Yes, you're right that this was a trade-off made in the name of performance. It sounds like we need a separate regression/acceptance test checking this corner case in isolation. It'll be slower but, hopefully, one test won't make a big difference.
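
A minimal sketch of the save/set/restore approach discussed in this thread (the helper name and its wiring into `_create_project_builder` are hypothetical, not part of this PR):

```python
import contextlib
import os
import tempfile
from typing import Iterator


@contextlib.contextmanager
def _temporary_pip_constraint(constraints: list[str]) -> Iterator[None]:
    """Point PIP_CONSTRAINT at a temporary constraints file for a block."""
    original = os.environ.get("PIP_CONSTRAINT")  # save the caller's value
    with tempfile.NamedTemporaryFile("w", suffix=".txt", delete=False) as tmp:
        tmp.write("\n".join(constraints))
    try:
        os.environ["PIP_CONSTRAINT"] = tmp.name
        yield  # e.g. the env.install(...) calls above would run here
    finally:
        # Restore (or unset) the original value and drop the temporary file.
        if original is None:
            os.environ.pop("PIP_CONSTRAINT", None)
        else:
            os.environ["PIP_CONSTRAINT"] = original
        os.unlink(tmp.name)
```

The `constraints` argument would carry the pins coming from the CLI options and any additional input files mentioned above.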

        yield builder


def _build_project_wheel_metadata(
    builder: build.ProjectBuilder,
) -> PackageMetadata:
    with tempfile.TemporaryDirectory() as tmpdir:
        path = pathlib.Path(builder.metadata_path(tmpdir))
        return importlib_metadata.PathDistribution(path).metadata


def _get_name(metadata: PackageMetadata) -> str:
    retval = metadata.get_all("Name")[0]  # type: ignore[index]
Review thread on this line:

webknjaz (Member): Interesting.. The pre-commit.ci service reports `piptools/build.py:120: error: Unused "type: ignore" comment [unused-ignore]` here, while the GHA CI doesn't. This makes me think that mypy runs against different Pythons in these envs. So maybe it's unnecessary to have this here. Or maybe use the `--python-version` mypy setting to make both test against the same interpreter version. WDYT @apljungquist?

Suggested change:

    - retval = metadata.get_all("Name")[0]  # type: ignore[index]
    + retval = metadata.get_all("Name")[0]

apljungquist (Contributor Author): I think specifying the language version is a good option, since it makes it easier to understand what is run in CI, and I have pushed a commit that I hope works.

    assert isinstance(retval, str)
    return retval


def _prepare_requirements(
    metadata: PackageMetadata, src_file: pathlib.Path
) -> Iterator[InstallRequirement]:
    package_name = _get_name(metadata)
    comes_from = f"{package_name} ({src_file})"
    package_dir = src_file.parent

    for req in metadata.get_all("Requires-Dist") or []:
        parts = parse_req_from_line(req, comes_from)
        if parts.requirement.name == package_name:
            # Replace package name with package directory in the requirement
            # string so that pip can find the package as self-referential.
            # Note the string can contain extras, so we need to replace only
            # the package name, not the whole string.
            replaced_package_name = req.replace(package_name, str(package_dir), 1)
            parts = parse_req_from_line(replaced_package_name, comes_from)

        yield InstallRequirement(
            parts.requirement,
            comes_from,
            link=parts.link,
            markers=parts.markers,
            extras=parts.extras,
        )


def _prepare_build_requirements(
    builder: build.ProjectBuilder,
    src_file: pathlib.Path,
    build_targets: tuple[str, ...],
    package_name: str,
) -> Iterator[InstallRequirement]:
    result = collections.defaultdict(set)

    # Build requirements will only be present if a pyproject.toml file exists,
    # but if there is also a setup.py file then only that will be explicitly
    # processed due to the order of `DEFAULT_REQUIREMENTS_FILES`.
    src_file = src_file.parent / PYPROJECT_TOML

    for req in builder.build_system_requires:
        result[req].add(f"{package_name} ({src_file}::build-system.requires)")
    for build_target in build_targets:
        for req in builder.get_requires_for_build(build_target):
            result[req].add(
                f"{package_name} ({src_file}::build-system.backend::{build_target})"
            )

    for req, comes_from_sources in result.items():
        for comes_from in comes_from_sources:
            yield install_req_from_line(req, comes_from=comes_from)
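
For orientation, a minimal sketch of calling the new `build_project_metadata` entry point directly (hypothetical standalone usage; inside pip-tools the exact wiring is done by the `pip-compile` command):

```python
import pathlib

from piptools.build import build_project_metadata

# Collect runtime, extra, and build requirements for the example project
# added in this PR (examples/readme/pyproject.toml), using an isolated
# build environment.
meta = build_project_metadata(
    pathlib.Path("examples/readme/pyproject.toml"),
    build_targets=("sdist", "wheel", "editable"),
    isolated=True,
    quiet=True,
)
print("extras:", meta.extras)
print("build requirements:", sorted(str(req) for req in meta.build_requirements))
```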