
Hey, you! The people that made this thing. You know that Torch v 2.4.1 is out, right?? #1096

Open
Xephier102 opened this issue Sep 8, 2024 · 8 comments

Comments

@Xephier102

Xephier102 commented Sep 8, 2024

❓ Questions and Help

I keep trying to install this thing to get rid of the errors when I run Automatic1111, but it keeps installing an outdated torch and uninstalling my up-to-date torch along with CUDA. I did try installing 2.4.0 with CUDA at one point, and IIRC that's when Automatic wouldn't launch at all. I'm so tired of this. I even tried installing the latest dev release of this, and it still installs an outdated PyTorch. C'mon, why bother making a new version at all if you aren't going to keep it up to date? The dev version isn't even a stable release yet, the one I got was the newest (and least stable) of the two, and it's still outdated.

Something that's new shouldn't force you to install something that's old. Isn't that just Dev 101?

 WARNING:xformers:WARNING[XFORMERS]: xFormers can't load C++/CUDA extensions. xFormers was built for:
    PyTorch 2.4.0+cu121 with CUDA 1201 (you have 2.4.1+cu118)
    Python  3.10.11 (you have 3.10.6)
  Please reinstall xformers (see https://github.com/facebookresearch/xformers#installing-xformers)
  Memory-efficient attention, SwiGLU, sparse and more won't be available.

And that's on the latest dev version..

@KUSH42

KUSH42 commented Sep 9, 2024

You can easily fix it by editing the MANIFEST file of the package. Use pip show xformers to know where to look.
Change Requires-Dist: torch ==2.4.0 in Line 19 to Requires-Dist: torch >=2.4.0.

But yeah, it's somewhat irritating.
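For anyone who wants to script that edit, here's a minimal sketch of the substitution, demonstrated on a throwaway copy of the file rather than the real one (the actual file sits in the xformers-*.dist-info folder under the Location that `pip show xformers` prints; `metadata_demo.txt` below is just a stand-in):

```shell
# Make a throwaway copy of the relevant dependency lines to demonstrate on.
# In a real install, locate the file with:  pip show -f xformers
printf 'Requires-Dist: numpy\nRequires-Dist: torch ==2.4.0\n' > metadata_demo.txt

# Relax the exact pin '==2.4.0' to a minimum version '>=2.4.0'
# (GNU sed; on macOS/BSD sed use `sed -i '' ...` instead):
sed -i 's/torch ==2.4.0/torch >=2.4.0/' metadata_demo.txt

cat metadata_demo.txt
```

Note this only changes what pip's resolver sees for the already-installed wheel; the compiled xformers binary itself is unchanged, so it's a workaround rather than a rebuild.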

@Xephier102
Author

Xephier102 commented Sep 10, 2024 via email

@DraconicDragon

DraconicDragon commented Sep 10, 2024

> You can easily fix it by editing the MANIFEST file of the package. Use pip show xformers to know where to look. Change Requires-Dist: torch ==2.4.0 in Line 19 to Requires-Dist: torch >=2.4.0.

Thanks, this worked for me (although it's the METADATA file in the xformers dist-info folder, not MANIFEST). Also, for the person above me: 'torch >=2.4.0' means xformers will accept any torch version that is 2.4.0 or above, but putting 2.4.1 should work anyway (I tested both). Maybe you forgot to save the file, or you aren't in the correct environment?
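To see why '>=2.4.0' also accepts 2.4.1, here's a small sketch of how a minimum-version specifier compares release numbers. This is a deliberate simplification of the real PEP 440 rules that pip applies (it ignores pre-release markers and the local '+cuXXX' build tag, which doesn't participate in ordering):

```python
def release(v: str) -> tuple:
    """Parse '2.4.1+cu118' into (2, 4, 1), dropping the local '+cu...' tag.
    Simplified sketch; pip's actual comparison uses the full PEP 440 rules."""
    return tuple(int(x) for x in v.partition("+")[0].split("."))

def satisfies(installed: str, minimum: str) -> bool:
    """True if the installed release is at least the required minimum."""
    return release(installed) >= release(minimum)

print(satisfies("2.4.1+cu118", "2.4.0"))  # torch >=2.4.0 accepts 2.4.1 -> True
print(satisfies("2.4.0", "2.4.0"))        # the exact minimum also passes -> True
print(satisfies("2.3.1", "2.4.0"))        # an older torch is rejected -> False
```

So with the relaxed specifier, pip no longer tries to downgrade torch 2.4.1 back to 2.4.0 when resolving xformers.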

@Xephier102
Author

> You can easily fix it by editing the MANIFEST file of the package. Use pip show xformers to know where to look. Change Requires-Dist: torch ==2.4.0 in Line 19 to Requires-Dist: torch >=2.4.0.

> Thanks, this worked for me (although its the METADATA file in the xformers dist-info folder, not MANIFEST). Also for the person above me, 'torch >=2.4.0' means xformers will accept any torch version that is 2.4.0 or above but putting 2.4.1 should work anyway (I tested both). Maybe you forgot to save the file or you aren't in the correct environment?

Well, I'm not generally the type of person whose problem gets fixed by the 'did you remember to plug it in?' kind of solution. I'm no genius at this, but I've definitely saved the file several times over at this point. As for the 'correct environment': I edited the METADATA file with Notepad in E:\auto1111\stable-diffusion-webui\venv\Lib\site-packages\xformers-0.0.28.dev895.dist-info

At first I didn't see the > symbol he added to the replacement text, hence I was confused that it wasn't 2.4.1, since nothing seemed to have changed. But I did add it today, and got the same results:

[+] xformers version 0.0.28.dev895 installed.
Launching Web UI with arguments:
WARNING:xformers:WARNING[XFORMERS]: xFormers can't load C++/CUDA extensions. xFormers was built for:
PyTorch 2.4.0+cu121 with CUDA 1201 (you have 2.4.1+cu118)
Python 3.10.11 (you have 3.10.6)
Please reinstall xformers (see https://github.com/facebookresearch/xformers#installing-xformers)
Memory-efficient attention, SwiGLU, sparse and more won't be available.
Set XFORMERS_MORE_DETAILS=1 for more details
No module 'xformers'. Proceeding without it.
*** Error loading script: trt.py
Traceback (most recent call last):
File "E:\auto1111\stable-diffusion-webui\modules\scripts.py", line 515, in load_scripts
script_module = script_loading.load_module(scriptfile.path)
File "E:\auto1111\stable-diffusion-webui\modules\script_loading.py", line 13, in load_module
module_spec.loader.exec_module(module)
File "", line 883, in exec_module
File "", line 241, in _call_with_frames_removed
File "E:\auto1111\stable-diffusion-webui\extensions\Stable-Diffusion-WebUI-TensorRT\scripts\trt.py", line 13, in

And here's the exact contents of the METADATA:

Metadata-Version: 2.1
Name: xformers
Version: 0.0.28.dev895
Summary: XFormers: A collection of composable Transformer building blocks.
Home-page: https://facebookresearch.github.io/xformers/
Author: Facebook AI Research
Author-email: oncall+xformers@xmail.facebook.com
Classifier: Programming Language :: Python :: 3.7
Classifier: Programming Language :: Python :: 3.8
Classifier: Programming Language :: Python :: 3.9
Classifier: Programming Language :: Python :: 3.10
Classifier: License :: OSI Approved :: BSD License
Classifier: Topic :: Scientific/Engineering :: Artificial Intelligence
Classifier: Operating System :: OS Independent
Requires-Python: >=3.7
Description-Content-Type: text/markdown
License-File: LICENSE
Requires-Dist: numpy
Requires-Dist: torch >=2.4.1

XFormers: A collection of composable Transformer building blocks.XFormers aims at being able to reproduce most architectures in the Transformer-family SOTA,defined as compatible and combined building blocks as opposed to monolithic models

I'm really not sure what else to do now.

@DraconicDragon

DraconicDragon commented Sep 11, 2024

> xFormers can't load C++/CUDA extensions. xFormers was built for: PyTorch 2.4.0+cu121 with CUDA 1201 (you have 2.4.1+cu118)

Did you previously use PyTorch 2.4.0+cu121 and then upgrade to 2.4.1 compiled with CUDA 11.8? Or did you install an xformers version specifically compiled for 2.4.0+cu121?
If so, the easiest fix is probably reinstalling xformers (pip install xformers --force-reinstall --no-deps), OR installing torch 2.4.1+cu121 to match your xformers (assuming your GPU supports CUDA 12.1).
As far as I know, the pip package of xformers should be universal and work across all PyTorch 2.4.0 (and .1) CUDA versions.
(if this helps anyone in some way: I'm using xformers==0.0.27.post2 and torch 2.4.1+cu124, but the latest xformers==0.0.28.dev895 from PyPI also worked)
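The warning text itself reflects exactly this kind of comparison: the torch release and CUDA tag baked into the xformers binary against whatever is installed in the venv. A rough sketch of the idea (the helper name is hypothetical; the real check lives inside xformers' compiled-extension loader, not in user code):

```python
def split_build(v: str):
    """Split a torch version like '2.4.1+cu118' into a release tuple and
    the local build tag (the CUDA toolkit it was compiled against).
    Hypothetical helper for illustration only."""
    rel, _, local = v.partition("+")
    return tuple(int(x) for x in rel.split(".")), local

built_for = split_build("2.4.0+cu121")   # what this xformers wheel was compiled against
installed = split_build("2.4.1+cu118")   # what is actually in the venv

# Both the torch release AND the CUDA build tag differ, so the prebuilt
# C++/CUDA extensions refuse to load and xformers falls back with a warning.
print(built_for == installed)  # False
```

That's why either side of the mismatch can be fixed: reinstall xformers to match the current torch, or install the torch build (2.4.1+cu121 here) that matches the xformers wheel.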

@Xephier102
Author

"assuming your GPU supports CUDA 12.1" I honestly haven't a clue. It doesn't seem to be the easiest info to find. Or I'm just blind. I got a https://www.techpowerup.com/gpu-specs/geforce-rtx-3070-ti-mobile.c3852 that. It was near top of the line a year ago, then the damn tech skyrocketed and AI art became a thing, and vram requirements skyrocketed. But yea. I been up for well over 24 hrs, 98% of that time was just trying to get this damn thing working. I had everything installed/upgraded that I wanted, and Xformers may have even worked, but the stupid commit hash thing wouldn't piss off so python could boot up the program. Just kept giving errors because I didn't have the software installed that IT wanted.. I tried for a long time just to get that code to go away. Edited the crap outta py files and such, Was almost afraid that if I managed to force it to work that I'd end up borking my laptop in the process.

Anyways, I'm hallucinating stuff in my peripheral vision. I gotta crash.

@Xephier102
Author

Xephier102 commented Sep 11, 2024

It didn't even occur to me that this page doesn't actually have anything to do with auto1111. Is that what you're using? Also, what version of Python do you have?

@DraconicDragon

DraconicDragon commented Sep 11, 2024

Yeah, xformers itself doesn't have much to do with auto1111. Also, your GPU supports CUDA 11.8/12.1/12.4.

My auto1111 install uses Python 3.10.14 (I mainly use SD WebUI Forge, a fork of A1111 that is faster and supports newer models like Flux, with Python 3.10.6; I can't say anything about its stability though).
Both have torch 2.4.1+cu124 installed with xformers 0.0.27.post2 (I tried pip install xformers==0.0.28.dev895 --no-deps too, and had no issues).
I have an RTX 3060 12GB.
