
Multi-GPU usage support? #3

Open
neutrinotek opened this issue Jan 11, 2024 · 1 comment


neutrinotek commented Jan 11, 2024

The server I'm running this on has two GPUs, but the app only seems to be utilizing one of them. Since they are older GPUs (GTX 1070), they only have 8 GB of VRAM each, so using the GPU-enabled version of this model results in an out-of-memory error. Is there currently any way to implement multi-GPU support in this app?


bigcat88 commented Jan 11, 2024

First, we need diffusers to support it: huggingface/diffusers#6240

After that we can remove this:

```python
if torch.cuda.is_available():
    PIPE.to("cuda")
elif torch.backends.mps.is_available():
    PIPE.to("mps")
```

and just do

```python
pipe = pipeline(..., device_map="auto")
```
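For what it's worth, the selection order in the snippet being removed can be sketched as a small pure function (the name `pick_device` and the injected availability flags are illustrative, not code from this repo), which makes the single-device behavior explicit: the whole pipeline lands on exactly one device, which is why only one of the two GTX 1070s is used today.

```python
def pick_device(cuda_available: bool, mps_available: bool) -> str:
    """Mirror the fallback above: prefer CUDA, then Apple MPS, else CPU.

    Exactly one device string is returned, so `PIPE.to(...)` always moves
    the entire pipeline onto a single device.
    """
    if cuda_available:
        return "cuda"
    if mps_available:
        return "mps"
    return "cpu"

# In the app this would be driven by the real torch checks, e.g.:
# pick_device(torch.cuda.is_available(), torch.backends.mps.is_available())
```

Once diffusers can shard a pipeline across devices, this whole-pipeline `.to(...)` call goes away in favor of letting the library place components via `device_map`.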
