Support "--no-build-isolation" when installing packages #1715
Comments
I hope this is fixed soon since
We'll support it. Assigning to myself.
I'd call this a workaround, not a problem solved... and note this approach would not be endorsed by the PyPA. I wonder, though, why these tools can't simply ensure pytorch is installed inside the isolated wheel-build environment, and why they must have access to the target Python instead.
Numpy is a more foundational library with similar interesting workarounds (oldest-supported-numpy). What the isolated build environment contains doesn't really matter, because today there's no way to declare that a library's build environment and runtime environment must be the same. These libraries care about the runtime version of pytorch that will be used. They may support older pytorch versions, but the build process can apply optimizations specific to the pytorch actually used at runtime. The pytorch present at build time does not matter to them, so build isolation is in direct conflict with this use case. A packaging standard here would ideally let specific dependencies declare that their build and runtime versions must match. For most dependencies isolation is fine; only a rare few are sensitive to this.

edit: If your intent is access to the target Python to dynamically inspect the runtime pytorch, that may work, but it also seems like a violation of the intent of build isolation, and it would still leave an unneeded separate pytorch in the build environment.
Yeah, +1 to the problem mdrissi mentions. The PyPI ecosystem doesn't really work well for this kind of thing. We have custom logic that wraps and handles all of this stuff at work (e.g. it uses things like #1415 as an escape hatch). Although note we don't actually use
If we could guarantee that we install the same pytorch version in the build env as in the target env (assuming we manage the target env's torch version), and we reflink/hardlink so installing is fast and doesn't take disk space, would that work for you? I don't think we can cover all use cases that
This is the only issue keeping me from using uv for everything.
PR is up: #2258
Wow, thanks @charliermarsh! Will test this on
## Summary

This PR adds support for pip's `--no-build-isolation`. When enabled, build requirements won't be installed during PEP 517-style builds, but the source environment _will_ be used when executing the build steps themselves. Closes #1715.
This just went out in v0.1.16. Feedback welcome as always.
Hi, I'm finding that `--no-build-isolation` makes it so that nothing in the pyproject.toml is processed. For example, we have this pyproject.toml:
When we use
Is there a way to assume that the dependencies are already installed, but still run `setuptools-git-versioning` etc.? Thanks
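For illustration only (the commenter's actual pyproject.toml isn't shown in the thread), a file along these lines, with `setuptools-git-versioning` as a build requirement, matches the situation described; the package name and version bounds here are assumptions:

```toml
# Hypothetical pyproject.toml; not the commenter's actual file.
[build-system]
requires = ["setuptools>=64", "setuptools-git-versioning"]
build-backend = "setuptools.build_meta"

[project]
name = "example-package"
dynamic = ["version"]

[tool.setuptools-git-versioning]
enabled = true
```

With `--no-build-isolation`, pip-style tools skip installing the `requires` list entirely, which is why the version plugin appears to be ignored unless it is already installed in the active environment.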
May I know how to install
I tried
@hcoona -- I think this isn't possible right now. I'll file an issue about it. I think you'll have better luck using
```
uv --version
0.1.5
```
This is an official issue to track a separate issue discovered in #1582. It was requested to be tracked separately here: #1582 (comment)
Short version:
Without `--no-build-isolation`, many popular ML libraries, including `flash-attn`, can't be `pip install`ed.

I want to be able to do this:

```
uv pip install flash-attn --no-build-isolation
```

But I can't.
If `uv pip install` doesn't support this, I don't think it will support installing some popular ML and Deep Learning Python modules.

Disclaimer
I just discovered this stuff in about 1 hour of reading, and my python-dependency knowledge is pretty poor. I think this is a good description of the issue, but know that this is NOT coming from an expert.
Long version
The "newer" versions of pip enable build-isolation by default when modules are being installed. (BIG discussion here for those interested in the history: pypa/pip#8437)
This is generally pretty nice, because it allows a python module to request a newer version of `setuptools` or something at install time, without polluting a user's global environment by permanently installing an undesired newer version of a module.

The downside is that it hides all existing installed modules from the module's setup.py script.
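In pyproject.toml terms, that setup-time request can be sketched like this (version numbers here are illustrative, not from any particular project); everything in `requires` is installed only into the temporary isolated build environment:

```toml
# Installed only into the temporary isolated build environment;
# the user's globally installed setuptools is left untouched.
[build-system]
requires = ["setuptools>=64", "wheel"]
build-backend = "setuptools.build_meta"
```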
The typical solution to this is to tell `pyproject.toml` about the modules required at setup time. It will install them in the isolated environment, and then proceed with the installation.

There is a bigger downside though:
A number of machine learning-related modules (Flash-attn, NVIDIA Apex) need pytorch to be installed before they are installed, so that they can do things like compile against the version of pytorch currently installed in the user's environment.
You might think that these modules should declare that they depend on Pytorch at setup time in their pyproject.toml file, but that isn't a good solution. No one wants a newly installed Pytorch version, and users might have installed pytorch using Conda or some other mechanism that pip isn't aware of; these modules want to just work with whatever version of pytorch is already there. It would cause a LOT more problems than it would solve.
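For illustration (a hypothetical file, not taken from any of these projects), the rejected approach would look like this in `pyproject.toml`; pip would then fetch a separate torch wheel into the isolated build environment, ignoring whatever torch (conda, custom CUDA build, etc.) the user already has:

```toml
# Discouraged: listing torch as a build requirement makes pip install a
# *separate* torch into the isolated build env, which may differ from
# the torch the user's target environment will actually run against.
[build-system]
requires = ["setuptools", "wheel", "torch"]  # "torch" here is the problematic part
build-backend = "setuptools.build_meta"
```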
So, modules like these simply instruct users to disable this build isolation by running `pip install` with `--no-build-isolation`. Problem solved! But `uv` doesn't support this. (As discovered in #1582.)

(This issue generally manifests as an error where setup.py can't find the `packaging` module, and users are confused because they already ran `pip install packaging`. See: Dao-AILab/flash-attention#453)

That's exactly the error I get when I `uv pip install flash-attn`:

```
ModuleNotFoundError: No module named 'packaging'
```
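The mechanism behind that error can be sketched in plain Python (an illustration of environment visibility, not flash-attn's actual setup.py): an import check in a build script only sees what is installed in the environment the build runs in, so under build isolation, packages installed only in the user's target environment appear missing.

```python
import importlib.util

def can_import(module_name: str) -> bool:
    # A setup.py-style check: succeeds only if the module is visible in
    # the environment the build runs in. Under build isolation that is a
    # fresh environment containing only the declared build requirements,
    # so packages like 'packaging' or 'torch' installed in the user's
    # target environment are invisible here.
    return importlib.util.find_spec(module_name) is not None

print(can_import("json"))                    # stdlib module: always visible -> True
print(can_import("some_missing_build_dep"))  # hypothetical absent package -> False
```

`--no-build-isolation` sidesteps this by running the build in the current environment, where checks like this find the already-installed packages.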