Running on Windows #3
I have tried it on CUDA version 12.0 and higher.
Hey, can you be more specific about what we would need to change? I'm not the brightest Windows user. I managed to get an output by disabling flash attention in `model_utils.py` (setting `use_flash_attention_2=False`), but I had a lot of errors, so I'd like to try your suggestion to change the torch installation script. I just don't know where to look :(

Never mind, I got it. After installing the NVIDIA toolkit 12.1, I changed the install script's line. Instead of `pip install torch==2.1.1 torchvision==0.16.1 torchaudio==2.1.1 --index-url https://download.pytorch.org/whl/cu118` I ran `pip install torch==2.1.1 torchvision==0.16.1 torchaudio==2.1.1 --index-url https://download.pytorch.org/whl/nightly/cu121`. Then I waited 45 minutes for it to build the wheel, and the GUI is working on Windows!
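For readability, here is the one-line change described above as a sketch (package versions and the nightly cu121 index URL are exactly as quoted in the comment; whether the nightly index is still needed may depend on when you install):

```shell
# Original line in the install script (CUDA 11.8 wheels):
# pip install torch==2.1.1 torchvision==0.16.1 torchaudio==2.1.1 \
#     --index-url https://download.pytorch.org/whl/cu118

# Replacement the commenter used (CUDA 12.1 nightly wheels):
pip install torch==2.1.1 torchvision==0.16.1 torchaudio==2.1.1 \
    --index-url https://download.pytorch.org/whl/nightly/cu121
```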
The command you linked gives:
I've got it working on Windows. Instead of using the torch command the person above posted, I ran what you get from here: https://pytorch.org/get-started/locally/ Specifically, the command is:

flash-attn took hours and hours to install for whatever reason, but it's working now, and I just finished running my first mesh through it.
Also, if you're getting anything about 11.8, you should clear out whatever environment the MeshAnything project created (I don't know what they call that in Python), since you want to use 12.1. And make sure your flash-attn is a recent version; I don't know what version the automatic install gets you, but if it's below 2.4 you should install a newer one. I'm on 2.5.9.post1. Earlier versions have issues on Windows; you can read about it on their GitHub page.

I got it working perfectly, but I see now it's really not meant for anything except toy examples. It can't handle complex meshes like scans or anything you'd use in the real world. Look at the cherry-picked examples in the pictures they've posted. Wish I had my evening back.
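The version check suggested above can be sketched in pure Python (the 2.4 threshold and the 2.5.9.post1 version come from the comment; the parsing helper here is just an illustration, not part of flash-attn):

```python
import re


def version_tuple(version: str) -> tuple:
    """Parse a version like '2.5.9.post1' into a comparable tuple (2, 5, 9),
    ignoring suffixes such as 'post1'."""
    return tuple(int(part) for part in re.findall(r"\d+", version)[:3])


def flash_attn_ok(version: str, minimum: str = "2.4") -> bool:
    # flash-attn versions below 2.4 reportedly have issues on Windows.
    return version_tuple(version) >= version_tuple(minimum)


print(flash_attn_ok("2.5.9.post1"))  # version from the comment -> True
print(flash_attn_ok("2.3.6"))        # too old per the comment -> False
```

You can get the installed version to feed into this check with `importlib.metadata.version("flash-attn")` from the standard library.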
this seems to be building for me, thanks |
WHL for flash-attn on Windows here. Needs CUDA 12.
Hey, I tried to run it on Windows, but flash-attn is not installing there.
From what I found in different GitHub repos, flash-attn needs CUDA version 12.0 or higher to run on Windows systems.
Maybe in the near future you can update your script to use a higher CUDA version, so we can test it on a Windows system too =)
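A quick way to check the CUDA toolkit requirement mentioned above is to parse the output of `nvcc --version`. This is a hedged sketch (the sample string below is illustrative of nvcc's output shape, not captured from a real run):

```python
import re
import subprocess


def cuda_release(nvcc_output: str):
    """Extract the CUDA release as (major, minor) from `nvcc --version` output."""
    match = re.search(r"release (\d+)\.(\d+)", nvcc_output)
    if match is None:
        return None
    return int(match.group(1)), int(match.group(2))


def installed_cuda_release():
    """Run the real nvcc, if it is on PATH (not executed in this sketch)."""
    out = subprocess.run(["nvcc", "--version"], capture_output=True, text=True)
    return cuda_release(out.stdout)


# Illustrative sample of the relevant nvcc output line:
sample = "Cuda compilation tools, release 12.1, V12.1.105"
print(cuda_release(sample))             # (12, 1)
print(cuda_release(sample) >= (12, 0))  # True: meets the >= 12.0 requirement
```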