
Stable Diffusion 2-1 Not Loaded on Colab with Minimal Loading #6446

Closed
maliozer opened this issue Jan 4, 2024 · 12 comments
Labels
bug Something isn't working

Comments

@maliozer

maliozer commented Jan 4, 2024

Describe the bug

Within this simple loading script:

from diffusers import DiffusionPipeline

pipeline = DiffusionPipeline.from_pretrained("stabilityai/stable-diffusion-2-1")

The progress is stuck; there is no error at all:
Fetching 13 files: 0% 0/13 [05:54<?, ?it/s]

Reproduction

Colab: https://colab.research.google.com/drive/1tzys4nVo95JAJgd_0MWp4s9Nm71wxQlZ?usp=sharing

Logs

Fetching 13 files: 0%
0/13 [05:54<?, ?it/s]
tokenizer/special_tokens_map.json: 100%
460/460 [00:00<00:00, 9.94kB/s]
tokenizer/tokenizer_config.json: 100%
824/824 [00:00<00:00, 50.3kB/s]

System Info

Colab

  • diffusers version: 0.25.0
  • Platform: Linux-6.1.58+-x86_64-with-glibc2.35
  • Python version: 3.10.12
  • PyTorch version (GPU?): 2.1.0+cu121 (False)
  • Huggingface_hub version: 0.20.1
  • Transformers version: 4.35.2
  • Accelerate version: not installed
  • xFormers version: not installed
  • Using GPU in script?:
  • Using distributed or parallel set-up in script?:

Who can help?

@yiyixuxu @DN6 @sayakpaul @patrickvonplaten

@maliozer maliozer added the bug Something isn't working label Jan 4, 2024
@sayakpaul
Member

sayakpaul commented Jan 4, 2024

Hmm this seems to be a problem only on Colab. I tried it on our internal server and on my local machine. It wasn't a problem.

@sayakpaul sayakpaul reopened this Jan 4, 2024
@DN6
Collaborator

DN6 commented Jan 4, 2024

The issue has also been raised on the Colab repo
googlecolab/colabtools#4272

@pranauv1

pranauv1 commented Jan 4, 2024

It's very weird, but running it on a .py file seems to be working.

Download the model through a Python file

from diffusers import DiffusionPipeline
import torch

DiffusionPipeline.from_pretrained("stabilityai/stable-diffusion-xl-base-1.0", torch_dtype=torch.float16, use_safetensors=True, variant="fp16")

Save the above as a .py file and run it on Colab, the model will be cached!

Now call the pipeline
This won't attempt to download from HF since the model is already cached locally!

pipeline = DiffusionPipeline.from_pretrained("stabilityai/stable-diffusion-xl-base-1.0", torch_dtype=torch.float16, use_safetensors=True, variant="fp16")
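The write-then-run pattern above can be sketched end to end. This is a minimal, runnable sketch: the payload is a harmless print standing in for the diffusers download, so the mechanics are visible without the multi-GB fetch, and the filename is just an example.

```python
import os
import subprocess
import sys
import tempfile

# Stand-in payload: in the real workaround this file would contain the
# DiffusionPipeline.from_pretrained(...) call shown above.
script = 'print("model cached")\n'

path = os.path.join(tempfile.mkdtemp(), "download_model.py")
with open(path, "w") as f:
    f.write(script)

# Equivalent to running `!python download_model.py` in a Colab cell: the
# download happens in a separate process, outside the notebook kernel,
# which is what made the stuck progress bars go away here.
result = subprocess.run([sys.executable, path],
                        capture_output=True, text=True, check=True)
print(result.stdout.strip())  # -> model cached
```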

@ritwikraha

I don't know if this is practical for everyone, but running it in Kaggle notebooks seems to work:

https://www.kaggle.com/ritzraha/diffusion-template

So it is highly probable that this is a Colab issue. FYI, I am still having the same problem (as documented above) on Colab.

@imanphung

imanphung commented Jan 4, 2024

pipe = StableDiffusionControlNetPipeline.from_pretrained(
    "SG161222/Realistic_Vision_V5.1_noVAE",
    controlnet=controlnet,
    torch_dtype=torch.float16,
    scheduler=noise_scheduler,
    vae=vae,
    token="",
).to("cuda")

I added the Hugging Face token inside and then it works fine!
You can try it.

@maliozer
Author

maliozer commented Jan 4, 2024

It may be waiting to read the hf_token from Colab but never gets access to it.

@KevinJBoyer

@imanphung thank you! Confirming that adding token="" to the .from_pretrained call worked for me too.

@maliozer
Author

maliozer commented Jan 4, 2024

> @imanphung thank you! Confirming that adding token="" to the .from_pretrained call worked for me too.

This did not work for me; I still need a valid solution.

@math-rb

math-rb commented Jan 4, 2024

It's very weird, but running it on a .py file seems to be working.

Download the model through a Python file

from diffusers import DiffusionPipeline
import torch

DiffusionPipeline.from_pretrained("stabilityai/stable-diffusion-xl-base-1.0", torch_dtype=torch.float16, use_safetensors=True, variant="fp16")

Save the above as a .py file and run it on Colab, the model will be cached!

Now call the pipeline. This won't attempt to download from HF since the model is already cached locally!

pipeline = DiffusionPipeline.from_pretrained("stabilityai/stable-diffusion-xl-base-1.0", torch_dtype=torch.float16, use_safetensors=True, variant="fp16")

Worked here.

@sayakpaul
Member

We're on top of it: huggingface/huggingface_hub#1953. Cc: @Wauplin

@Wauplin
Collaborator

Wauplin commented Jan 5, 2024

Thanks everyone for reporting and sorry for the inconvenience. I have been able to reproduce the issue and created a PR (huggingface/huggingface_hub#1953) to fix it. The plan is to make a hot-fix release once it's merged and make it the default in google colab. In the meantime, a temporary solution is to install a different version of huggingface_hub before executing any code. More details in huggingface/huggingface_hub#1953 (comment).
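Until the fixed release reached Colab, a small guard in the first notebook cell could catch the affected version before any download starts. A sketch, assuming (per this thread) that 0.20.1 is the affected huggingface_hub release and 0.20.2 carries the fix; `is_affected` is a hypothetical helper, not a library API:

```python
# Sketch: flag the huggingface_hub release that exhibits the Colab hang.
# Per this thread, 0.20.1 is affected and 0.20.2 ships the fix.
def is_affected(version: str) -> bool:
    parts = tuple(int(p) for p in version.split(".")[:3])
    return parts == (0, 20, 1)

print(is_affected("0.20.1"))  # True  -> reinstall huggingface_hub, restart runtime
print(is_affected("0.20.2"))  # False -> safe to proceed
```

In a notebook you would feed this `huggingface_hub.__version__` and reinstall the package (then restart the runtime) if it returns True.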

@Wauplin
Collaborator

Wauplin commented Jan 12, 2024

Update: huggingface_hub 0.20.2 is now the default version on Google Colab, meaning this issue should be fixed for everyone without having to upgrade (see googlecolab/colabtools#4272 (comment)).

@Wauplin Wauplin closed this as completed Jan 12, 2024
9 participants