
Promoting experimental_requirement_cycles from experimental #1663

Open
Philip-Murzynowski opened this issue Jan 2, 2024 · 4 comments

Philip-Murzynowski commented Jan 2, 2024

🚀 feature request

Relevant Rules

I have been making use of the experimental_requirement_cycles parameter of the pip_parse rule and have greatly benefited from it. I wanted to ask whether there is a plan for promoting it to stable, or anything I could look into to help make that happen.
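For context, a minimal sketch of how the parameter is wired up (using the well-known sphinx dependency cycle as the example group; the repo name and lock-file label are placeholders to adjust for your own setup):

```starlark
# WORKSPACE sketch: experimental_requirement_cycles maps a group name to the
# packages that form a dependency cycle in the lock file.
load("@rules_python//python:pip.bzl", "pip_parse")

pip_parse(
    name = "pip",
    requirements_lock = "//:requirements_lock.txt",
    experimental_requirement_cycles = {
        "sphinx": [
            "sphinx",
            "sphinxcontrib-applehelp",
            "sphinxcontrib-devhelp",
            "sphinxcontrib-htmlhelp",
            "sphinxcontrib-qthelp",
            "sphinxcontrib-serializinghtml",
        ],
    },
)
```

Depending on any member of a group pulls in the whole group, which is what makes the cycle resolvable.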

Description

The experimental feature is introduced here in the CHANGELOG. I saw that the recommended way to request that a feature be taken out of experimental is to open an issue, as noted in support.md.

Describe the solution you'd like

As this is my first activity in this repository, please let me know if you would like more information.

Describe alternatives you've considered

arrdem (Contributor) commented Jan 23, 2024

As the primary author of that feature I'm 50/50 on promoting it to stable. It's a fine solution for making true dependency cycles work (e.g. a <-> b in all cases of extras requirements). I'm less convinced that it's a good solution for making requirements using extras work, since we have to convert what should be optional dependencies into strong dependencies.

Consider a[] requiring b with b requiring a, and a[c] requiring c with c requiring a. py_requirement("a") is now ambiguous. Does the user intend a[c]? a[]? Do we intend the requirements.txt locks to function as virtualenvs from which everything* is installed or as a dependency solution set from which only the needed requirements should be installed?

Personally I prefer the latter. I think having to merge all the Airflow optional dependencies into a single requirement group isn't a great solution, because it means that any artifact which requires Airflow takes a dependency on the entire group of requirements. It would be better if, say, the mysql extra weren't always implied; as it stands, py_requirement("airflow") means the entire group with all extras.
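The merging being described can be sketched like this (package names are illustrative, not a real Airflow lock file):

```starlark
# Hypothetical sketch: because the cycle group is all-or-nothing, every
# consumer of airflow also takes a dependency on the mysql extra's
# packages, whether it wants them or not.
experimental_requirement_cycles = {
    "airflow": [
        "apache-airflow",
        "apache-airflow-providers-mysql",  # illustrative extra member
        "mysqlclient",                     # illustrative transitive dep
    ],
},
```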

There's plenty of room to critique setuptools/pip here for extras really behaving as sneaky transitive deps rather than proper package configuration; see https://discuss.python.org/t/provide-information-on-installed-extras-in-package-metadata/15385

There's the related problem of needing to merge packaging cycles when there are multiple, again see the newer Airflow packaging. Is that really the right behavior? Or could we do something clever to break these cycles into separate components which would behave more reasonably as optional extras?
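The "separate components" idea maps naturally onto strongly connected components of the requirement graph; a standalone Python sketch (not rules_python code) of how such cycle detection could work:

```python
# Sketch: group a requirement graph into strongly connected components
# (Tarjan's algorithm). Each multi-node component is a genuine cycle that
# needs a group; singleton components can stay ordinary dependencies.

def strongly_connected_components(graph):
    """graph maps package -> list of dependencies; returns a list of sets."""
    index = {}
    lowlink = {}
    on_stack = set()
    stack = []
    sccs = []
    counter = [0]

    def visit(node):
        index[node] = lowlink[node] = counter[0]
        counter[0] += 1
        stack.append(node)
        on_stack.add(node)
        for dep in graph.get(node, []):
            if dep not in index:
                visit(dep)
                lowlink[node] = min(lowlink[node], lowlink[dep])
            elif dep in on_stack:
                lowlink[node] = min(lowlink[node], index[dep])
        if lowlink[node] == index[node]:
            component = set()
            while True:
                member = stack.pop()
                on_stack.discard(member)
                component.add(member)
                if member == node:
                    break
            sccs.append(component)

    for node in list(graph):
        if node not in index:
            visit(node)
    return sccs

# a <-> b form one cycle; c depends on a but is not part of the cycle.
graph = {"a": ["b"], "b": ["a"], "c": ["a"]}
groups = [s for s in strongly_connected_components(graph) if len(s) > 1]
print([sorted(s) for s in groups])  # [['a', 'b']]
```

With components computed this way, only the true cycles would need to be merged into groups, instead of one large group covering everything.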

aignas added a commit to aignas/rules_python that referenced this issue Apr 17, 2024
With this change we can in theory have multi-platform libraries in the
dependency cycle and use the pip hub repo for the dependencies. With this
we can also make the contents of `whl_library` independent of which
platform the actual dependencies target. This allows us to support the
following topologies:

* A platform-specific wheel depends on cross-platform wheel.
* A cross-platform wheel depends on cross-platform wheel.
* A whl_library can have `select` dependencies based on the interpreter
  version, e.g. pull in a `tomli` dependency only when the Python
  interpreter is less than 3.11.

Relates to bazelbuild#1663.
Work towards bazelbuild#735.
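The third topology can be sketched with an ordinary select(); the config_setting label and hub repo name below are illustrative, not necessarily the exact ones rules_python generates:

```starlark
# BUILD sketch: pull in tomli only on interpreters older than 3.11,
# where tomllib is not yet in the standard library.
py_library(
    name = "mylib",
    srcs = ["mylib.py"],
    deps = select({
        "@rules_python//python/config_settings:is_python_3.10": ["@pip//tomli"],
        "//conditions:default": [],  # 3.11+ uses stdlib tomllib
    }),
)
```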
github-merge-queue bot pushed a commit that referenced this issue Apr 30, 2024
…#1856)

Relates to #1663.
Work towards #735.
aignas (Collaborator) commented Jul 1, 2024

Some thoughts about a better solution here: #1728 (comment)

I'm not sure yet how to integrate the current experimental_index_url with the code snippet in there, but it would be an alternative that would not require users to specify experimental_requirement_cycles at all, and it would be my preferred solution.

github-merge-queue bot pushed a commit that referenced this issue Aug 15, 2024
Before this change the `all_requirements` and related constants would include packages that need to be installed only on specific platforms, which means that users relying on those constants (e.g. `gazelle`) need to do extra work to exclude platform-specific packages. The package managers that support outputting such files now include `uv` and `pdm`. This might also be useful in cases where we attempt to handle non-requirements lock files.

Note that the best way to handle this would be to move all of the requirements parsing code to Python, but that may cause regressions as it is a much bigger change. This change only adds extra processing for the requirement lines that have env markers. The lines that have no markers will not see any change in the code execution paths, and the Python interpreter will not be downloaded.

We also use the `*_ctx.watch` API where available to correctly re-evaluate the markers if the `packaging` Python sources change.

Extra changes that are included in this PR:
- Extend `repo_utils` with a method for getting the `arch` from the `ctx`.
- Change the `local_runtime_repo` to perform the validation without relying on the implementation detail of `get_platforms_os_name`.
- Add a `$(UV)` make variable for the `uv:current_toolchain` so that we can generate the requirements for `sphinx` using `uv`.
- Swap the requirement generation to `genrule` and `uv` for `sphinx` and co so that we can test the `requirement` marker code. Note that the `requirement` markers are not working well with the `requirement_cycles`.

Fixes #1105.
Fixes #1868.
Work towards #260, #1975.
Related #1663.

---------

Co-authored-by: Richard Levasseur <rlevasseur@google.com>
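The marker handling described above can be illustrated with a deliberately simplified stand-in (real code should use the `packaging` library's full PEP 508 marker evaluation; this toy version understands only one marker form):

```python
# Sketch: split a requirements.txt line on its env marker and decide whether
# the requirement applies for a given interpreter version.
import re

def split_marker(line):
    """Return (requirement, marker) for a requirements.txt line."""
    requirement, _, marker = line.partition(";")
    return requirement.strip(), marker.strip() or None

def marker_applies(marker, python_version):
    """Evaluate only the `python_version < "X.Y"` marker form."""
    m = re.fullmatch(r'python_version\s*<\s*"(\d+)\.(\d+)"', marker)
    if not m:
        raise ValueError("unsupported marker: %s" % marker)
    return python_version < (int(m.group(1)), int(m.group(2)))

req, marker = split_marker('tomli>=1.1.0 ; python_version < "3.11"')
print(req)                              # tomli>=1.1.0
print(marker_applies(marker, (3, 10)))  # True
print(marker_applies(marker, (3, 12)))  # False
```

Lines without a marker take the fast path (marker is None), which mirrors the commit's point that unmarked lines see no change in code execution.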
gholms (Contributor) commented Oct 21, 2024

This came up as a potential solution for linking type stub wheels with their corresponding main wheels, to ensure they are present for rules which run static type checkers like mypy. At present there is no clear way to do this; the community varies from requiring one to supply the transitive closure of stub wheels to individual test rules, to automagically building a fake site-packages directory, to throwing our hands up in the air. The most appropriate place to do this is in pip_parse, the code which actually provides wheels to Bazel.

While we could add a bespoke mechanism for metadata wheels like these, experimental_requirement_cycles is already a means of grouping wheels together which can handle this use case. Perhaps switching to a more generic name such as whl_groups as we stabilize this would be appropriate.
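A sketch of the stub-grouping use case, reusing experimental_requirement_cycles as-is (the group here is not a true import cycle, just co-installation; the lock-file label is a placeholder):

```starlark
pip_parse(
    name = "pip",
    requirements_lock = "//:requirements_lock.txt",
    experimental_requirement_cycles = {
        # (Ab)uses the cycle mechanism: keeps the stub wheel alongside the
        # runtime wheel so a type checker always sees both.
        "requests": ["requests", "types-requests"],
    },
)
```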

This again raises the question of how to handle optional dependencies, but from a different perspective from extras. Type stubs are more or less useful in an "all-or-nothing" way, making build settings an attractive option. A solution which supports select statements would facilitate this quite well. That said, we don't necessarily have to block stabilization on solving this problem. It is enough to simply consider it in the design.

aignas (Collaborator) commented Dec 10, 2024

Just to complete the option list for the above stub proposal: #2425 has landed.
