I tried upgrading Visual Studio and CUDA 12.4 and 12.6, but it still can't build the wheel and returns this #1117
I don't think that output is enough for us to figure out what went wrong. Could you include more of the logs? Also, please note that Windows support is best-effort since we're not experts in it.
PS C:\WINDOWS\system32> pip install xformers
× python setup.py bdist_wheel did not run successfully.
note: This error originates from a subprocess, and is likely not a problem with pip.
Thanks. There's nothing that looks especially amiss, except perhaps two things:
The error checking the compiler version is almost certainly because ninja isn't installed, and the build just isn't finding the compiler at that stage. The setup.py has an explicit hook into ninja that it uses to build the extension, rather than letting the build system treat ninja as an optional accelerator, so it's a hard requirement. Ninja is actually detrimental to build speeds on Windows, but it's easier to just go along with it. It's even part of setup_requires in flash-attention's setup.py... although I think it would need to be in a different _requires section, or in pyproject.toml's [build-system] requires list, for pip to actually install it before doing the build:

    setup_requires=[
        "packaging",
        "psutil",
        "ninja",
    ],

The only issue I see with the compiler version is that you're using the 32-bit host compiler that targets x64. This project needs a ton of memory to build, so you probably shouldn't do that. It shouldn't even be choosing that compiler unless you're installing from a Visual Studio command prompt; flash-attention builds will break if you run from the compiler environment because ninja duplicates options (Visual Studio / nvcc handle all of that in their own integrations). So go to a clean prompt, make sure the compiler isn't on your default PATH, and try again if just installing ninja doesn't fix it.

Also, not quite related, but something is injecting all of these disable-warning flags into both compilers. It doesn't appear to be xformers or flash-attention, and the warnings they silence range from bone-stupid to meh; the narrowing conversions are the worst. Unfortunately I can't give any real suggestions on removing them from the ninja compiler commands, since I don't know what is generating them or why. I always made it a point to build with /Werror on C++ projects and fix everything, so I'm pedantic.
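If it helps, a quick way to sanity-check both points from a plain (non-VS) prompt is a short diagnostic snippet like the one below. This is only a sketch, assuming a recent PyTorch where torch.utils.cpp_extension.is_ninja_available() exists; it isn't something xformers itself runs.

    # Diagnostic sketch: check whether ninja is usable and whether a cl.exe is
    # already sitting on PATH. Assumes PyTorch is installed.
    import shutil
    from torch.utils.cpp_extension import is_ninja_available

    print("ninja available:", is_ninja_available())

    cl_path = shutil.which("cl")
    if cl_path is None:
        # A clean prompt, as suggested above.
        print("cl.exe is not on PATH")
    else:
        # A path containing HostX86\x64 means the 32-bit host compiler
        # targeting x64, i.e. the memory-limited toolchain mentioned above.
        print("cl.exe on PATH:", cl_path)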
Torch's minimum compiler version is 19.0.24215, by the way. The highest version I have is the same 14.41.34120 you have, which is compiler version 19.41.34120; the lowest is 19.29.30154 from March of this year, so there are no version issues there. The only way torch throws that error on Windows is if the version it actually detects is too low (possible if you have one from an older VS version set on the path on purpose, but not likely), or if it errors running CL.exe or parsing the three-part version number out with a regex. I suppose it could also happen with the Windows Store Python, which wouldn't be able to call executables outside its own directory tree, but since torch also calls LoadModuleA on kernel32.dll at runtime and directly looks up functions in it, I doubt it would work at all if that were the case.
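For reference, the check being described is roughly of this shape. This is an illustrative sketch of running cl.exe and parsing its banner against the minimum version quoted above, not torch's actual implementation; the two failure modes are exactly the ones mentioned (cl.exe can't be executed, or the version can't be parsed).

    # Rough illustration (not torch's actual code) of the MSVC version check.
    import re
    import subprocess

    MIN_MSVC = (19, 0, 24215)  # minimum compiler version mentioned above

    try:
        # cl.exe prints its version banner to stderr when run with no arguments.
        banner = subprocess.run(["cl"], capture_output=True, text=True).stderr
    except FileNotFoundError:
        raise SystemExit("cl.exe not found / not executable from this environment")

    match = re.search(r"(\d+)\.(\d+)\.(\d+)", banner)
    if match is None:
        raise SystemExit("could not parse a version out of cl.exe's output")

    version = tuple(int(part) for part in match.groups())
    print("detected", version, "OK" if version >= MIN_MSVC else "too old for torch")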
❓ Questions and Help
C:\Users\npc23\AppData\Local\Temp\pip-install-h1q0rzya\xformers_805f34a4f57944f49849e632baae6a19\third_party\flash-attention\csrc\flash_attn\flash_api.cpp : fatal error C1083: Cannot open compiler generated file: "": Invalid argument
error: command 'C:\Program Files (x86)\Microsoft Visual Studio\2022\BuildTools\VC\Tools\MSVC\14.41.34120\bin\HostX86\x64\cl.exe' failed with exit code 1
[end of output]
note: This error originates from a subprocess, and is likely not a problem with pip.
ERROR: Failed building wheel for xformers
Running setup.py clean for xformers
Failed to build xformers
ERROR: ERROR: Failed to build installable wheels for some pyproject.toml based projects (xformers)