Promoting experimental_requirement_cycles from experimental #1663
Comments
As the primary author of that feature I'm 50/50 on promoting it to stable. It's a fine solution for making true dependency cycles work (e.g. […]). Consider […]. Personally I prefer the latter. I think having to merge all the Airflow optional dependencies into a single requirement group isn't a great solution, because it means that any artifact which requires Airflow takes a dependency on the entire group of requirements. It would be better if, say, the mysql extra wasn't always implied, because […]. There's plenty of room to critique setuptools/pip here, since extras really behave as sneaky transitive deps rather than proper package configuration: https://discuss.python.org/t/provide-information-on-installed-extras-in-package-metadata/15385. There's the related problem of needing to merge packaging cycles when there are multiple; again, see the newer Airflow packaging. Is that really the right behavior? Or could we do something clever to break these cycles into separate components, which would behave more reasonably as optional extras?
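For readers unfamiliar with the feature under discussion, usage looks roughly like this. This is a sketch based on the rules_python documentation; the group name and the package list are illustrative, not a recommended Airflow grouping:

```python
# MODULE.bazel sketch (Starlark) -- illustrative only
pip = use_extension("@rules_python//python/extensions:pip.bzl", "pip")
pip.parse(
    hub_name = "pypi",
    python_version = "3.11",
    requirements_lock = "//:requirements_lock.txt",
    # Every member of a true dependency cycle must be listed in one group;
    # depending on any member then pulls in the whole group.
    experimental_requirement_cycles = {
        "airflow": [
            "apache-airflow",
            "apache-airflow-providers-common-sql",
            "apache-airflow-providers-sqlite",
        ],
    },
)
```

The coarseness of the grouping is exactly the concern raised above: one group member implies all of them.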
(#1856) With this change we can in theory have multi-platform libraries in the dependency cycle and use the pip hub repo for the dependencies. We can also make the contents of `whl_library` independent of the platform of the actual dependencies. This allows us to support the following topologies:

* A platform-specific wheel depends on a cross-platform wheel.
* A cross-platform wheel depends on a cross-platform wheel.
* A `whl_library` can have `select` dependencies based on the interpreter version, e.g. pull in a `tomli` dependency only when the Python interpreter is less than 3.11.

Relates to #1663. Work towards #735.
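The interpreter-version topology can be pictured as a generated BUILD fragment along these lines. This is a hypothetical sketch; the config-setting label, hub name, and package names are assumptions, and real generated output would list each matching minor version rather than just one:

```python
# Generated BUILD fragment sketch (Starlark) -- labels are illustrative
py_library(
    name = "pkg",
    deps = [
        "@pypi//requests",
    ] + select({
        # pull in tomli only when the interpreter is 3.10
        # (a real rule would cover every version below 3.11)
        "@rules_python//python/config_settings:is_python_3.10": ["@pypi//tomli"],
        "//conditions:default": [],
    }),
)
```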
Some thoughts about a better solution here: #1728 (comment). Not sure yet how to integrate the current […]
Before this change the `all_requirements` and related constants would include packages that need to be installed only on specific platforms, meaning that users relying on those constants (e.g. `gazelle`) would need to do extra work to exclude platform-specific packages. The package managers that support outputting such files now include `uv` and `pdm`. This might also be useful in cases where we attempt to handle non-requirements lock files.

Note that the best way to handle this would be to move all of the requirements-parsing code to Python, but that could cause regressions as it is a much bigger change. This change only adds extra processing for the requirement lines that have env markers. Lines without markers see no change in their code execution paths, and the Python interpreter will not be downloaded. We also use the `*_ctx.watch` API where available to correctly re-evaluate the markers if the `packaging` Python sources change.

Extra changes included in this PR:
- Extend `repo_utils` with a method for getting `arch` from the `ctx`.
- Change `local_runtime_repo` to perform the validation without relying on the implementation detail of `get_platforms_os_name`.
- Add a `$(UV)` make variable for `uv:current_toolchain` so that we can generate the requirements for `sphinx` using `uv`.
- Swap the requirement generation to a `genrule` using `uv` for `sphinx` and co so that we can test the `requirement` marker code. Note, the `requirement` markers are not working well with the `requirement_cycles`.

Fixes #1105. Fixes #1868. Work towards #260, #1975. Related: #1663.

---------

Co-authored-by: Richard Levasseur <rlevasseur@google.com>
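The "extra processing for requirement lines that have env markers" hinges on first separating the requirement spec from its marker. A minimal sketch of that split in plain Python (this mirrors the idea, not the actual rules_python implementation, and deliberately ignores edge cases such as `;` inside URLs or quoted markers):

```python
def split_marker(line):
    """Split a requirements-lock line into (requirement, marker).

    Sketch only: a real parser must handle ';' appearing inside
    direct-reference URLs and quoted strings, which this does not.
    """
    req, sep, marker = line.partition(";")
    return req.strip(), (marker.strip() if sep else None)

print(split_marker('tomli==2.0.1 ; python_version < "3.11"'))
print(split_marker("requests==2.31.0"))
```

Lines without a marker come back with `None`, which is what lets the fast path skip marker evaluation (and the interpreter download) entirely.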
This came up as a potential solution for linking type stub wheels with their corresponding main wheels, to ensure they are present for rules which run static type checkers like mypy. At present there is no clear way to do this; the community varies from requiring one to supply the transitive closure of stub wheels to individual test rules, to automagically building a fake site-packages directory, to throwing our hands up in the air. The most appropriate place to do this is in […]. While we could add a bespoke mechanism for metadata wheels like these, […]. This again raises the question of how to handle optional dependencies, but from a different perspective than extras. Type stubs are more or less useful in an "all-or-nothing" way, making build settings an attractive option. A solution which supports […]
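As a stopgap, the cycles mechanism itself could be pressed into service to co-install stubs alongside their package. A hypothetical sketch only; nothing here is an endorsed API for stubs, and the package pairing is illustrative:

```python
# MODULE.bazel sketch (Starlark): grouping a package with its stub
# distribution so that depending on the package also makes the stubs
# visible to type checkers.
pip.parse(
    hub_name = "pypi",
    requirements_lock = "//:requirements_lock.txt",
    experimental_requirement_cycles = {
        # not a true cycle -- the grouping is only (ab)used to keep the
        # stubs installed alongside the package they annotate
        "pandas": ["pandas", "pandas-stubs"],
    },
)
```

The obvious downside is the same one raised for extras: every dependent of the package now also pulls in the stubs, whether or not it type-checks.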
Just to complete the option list for the above stub proposal: #2425 has landed.
🚀 feature request
Relevant Rules
I have been making use of the `experimental_requirement_cycles` parameter for the `pip_parse` rule and greatly benefiting from it. I wanted to ask if there is a release plan, or anything I could look into to help incorporate it into a release.

Description
The experimental feature is introduced here in the CHANGELOG. I saw that the recommendation for requesting that a feature be taken out of experimental was to open an issue, as noted in support.md.
Describe the solution you'd like
As this is my first activity in this repository, please let me know if you would like more information.

Describe alternatives you've considered