
How can I run this project on an M1 Mac? #42

Open
abdurrahmanekr opened this issue Sep 27, 2023 · 0 comments
I followed all the instructions this project provides, and I get this error:

ValueError: torch.cuda.is_available() should be True but is False. xformers' memory efficient attention is only available for GPU

After I removed the --enable_xformers_memory_efficient_attention argument from my command, the error changed to:

...src/inference.py", line 226, in main
    generator = torch.Generator("cuda").manual_seed(args.seed)
                ^^^^^^^^^^^^^^^^^^^^^^^
RuntimeError: Device type CUDA is not supported for torch.Generator() api.

I searched for that error and found the PyTorch MPS documentation, so I changed the code at src/inference.py:226 to:

generator = torch.Generator("mps").manual_seed(args.seed)

Maybe I'm doing something wrong, because that didn't work either. I was going to try running without the GPU, but I haven't yet. Is there a way to disable CUDA/the GPU entirely?
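For what it's worth, a minimal sketch of a device-selection fallback that could replace the hard-coded "cuda" at that line. The `pick_device` helper is hypothetical (not part of this project); the `torch.cuda.is_available()` / `torch.backends.mps.is_available()` calls are the standard PyTorch API, and the logic is kept as a plain function so it can be checked without a GPU:

```python
def pick_device(cuda_available: bool, mps_available: bool) -> str:
    """Return the best available torch device name: 'cuda', 'mps', or 'cpu'."""
    if cuda_available:
        return "cuda"
    if mps_available:
        return "mps"
    return "cpu"

# In src/inference.py this could replace the hard-coded "cuda", e.g.:
#
#   import torch
#   device = pick_device(torch.cuda.is_available(),
#                        torch.backends.mps.is_available())
#   generator = torch.Generator(device).manual_seed(args.seed)
#
# Caveat: on some PyTorch versions torch.Generator("mps") is not
# supported; in that case a CPU generator still works (only the seeded
# random numbers live on CPU, not the model):
#
#   generator = torch.Generator("cpu").manual_seed(args.seed)
```

Note this only fixes the generator line; any other `.to("cuda")` or `.cuda()` calls in the script would need the same treatment before it runs on an M1.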

Thank you

My environment:
MacBook Pro 14-inch, 2021
Chip: Apple M1 Pro
OS: macOS 13.6 (22G120)
Python: 3.11
