I followed all the instructions this project provides, but I get this error:
ValueError: torch.cuda.is_available() should be True but is False. xformers' memory efficient attention is only available for GPU
After I removed the --enable_xformers_memory_efficient_attention argument from my command, the error changed to this:
...src/inference.py", line 226, in main
generator = torch.Generator("cuda").manual_seed(args.seed)
^^^^^^^^^^^^^^^^^^^^^^^
RuntimeError: Device type CUDA is not supported for torch.Generator() api.
I searched for that error and found the PyTorch MPS documentation, so I changed the code at src/inference.py:226 to:
Maybe I did something wrong, because that didn't work either. I was going to try running without the GPU, but I haven't yet. Is there a way to disable CUDA/the GPU?
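One possible approach (a sketch, not tested against this project): pick the device at runtime instead of hard-coding "cuda", and seed a CPU generator, since torch.Generator("cuda") raises exactly this RuntimeError when CUDA is absent. The `pick_device` helper below is hypothetical, and the commented usage assumes a recent PyTorch build with the MPS backend:

```python
# Hedged sketch: choose the best available device and seed deterministically.
# A CPU generator is used because creating a CUDA generator fails on machines
# without CUDA (e.g. Apple Silicon), while a CPU generator works everywhere.

def pick_device(cuda_ok: bool, mps_ok: bool) -> str:
    """Return a torch device string given availability flags."""
    if cuda_ok:
        return "cuda"
    if mps_ok:
        return "mps"
    return "cpu"

# Possible use in inference code (assumes PyTorch >= 1.12 for MPS support):
# import torch
# device = pick_device(torch.cuda.is_available(),
#                      torch.backends.mps.is_available())
# generator = torch.Generator("cpu").manual_seed(args.seed)
# model.to(device)
```

Whether the rest of the pipeline accepts a CPU generator and an "mps" device depends on the project's code, so treat this as a starting point only.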
Thank you
My environment:
MacBook Pro 14-inch, 2021
chip: Apple M1 Pro
os: 13.6 (22G120)
python: 3.11