In a local JupyterLab that is not a Google Colab environment, _get_token_from_google_colab freezes and stops responding #1944

Closed
tanaga9 opened this issue Dec 29, 2023 · 6 comments
Labels
bug Something isn't working

Comments

tanaga9 commented Dec 29, 2023

Describe the bug

In a local JupyterLab that is not a Google Colab environment, _get_token_from_google_colab freezes and stops responding.

I temporarily worked around the issue with the following monkey-patch:

import huggingface_hub.utils._token

# Skip the Colab secret lookup: get_token() chains its lookups with "or",
# so a falsy value falls through to the env-var and token-file lookups.
huggingface_hub.utils._token._get_token_from_google_colab = lambda: False
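
The docstring visible in the traceback below declares that get_token() returns str or None, and _get_token_from_google_colab() is the first branch of an or-chain, so a variant of the patch returning None (untested here, but faithful to the declared type) should behave the same:

import huggingface_hub.utils._token

# Variant of the patch above: None matches the declared return type
# and is still falsy, so get_token() falls through to the other lookups.
huggingface_hub.utils._token._get_token_from_google_colab = lambda: None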

Reproduction

  • Use a local JupyterLab that is not a Google Colab environment.
  • Call StableDiffusionControlNetImg2ImgPipeline.from_single_file.

The traceback from interrupting the kernel during the freeze is attached in the Logs section below.

Logs

---------------------------------------------------------------------------
KeyboardInterrupt                         Traceback (most recent call last)
<ipython-input-7-30de77117569> in <cell line: 1>()
----> 1 pipe = StableDiffusionControlNetImg2ImgPipeline.from_single_file(
      2     #"https://huggingface.co/OedoSoldier/ambientmix",
      3     "ambientmix-half.safetensors",
      4       # "ambientmixAnAnime_v10.safetensors",
      5       #"../stable-diffusion-webui/models/Stable-diffusion/ambientmixAnAnime_v10.safetensors",

/usr/local/lib/python3.10/dist-packages/diffusers/loaders.py in from_single_file(cls, pretrained_model_link_or_path, **kwargs)
   1920             )
   1921 
-> 1922         pipe = download_from_original_stable_diffusion_ckpt(
   1923             pretrained_model_link_or_path,
   1924             pipeline_class=cls,

/usr/local/lib/python3.10/dist-packages/diffusers/pipelines/stable_diffusion/convert_from_ckpt.py in download_from_original_stable_diffusion_ckpt(checkpoint_path, original_config_file, image_size, prediction_type, model_type, extract_ema, scheduler_type, num_in_channels, upcast_attention, device, from_safetensors, stable_unclip, stable_unclip_prior, clip_stats_path, controlnet, load_safety_checker, pipeline_class, local_files_only, vae_path, vae, text_encoder, tokenizer, config_files)
   1532         )
   1533     elif model_type == "FrozenCLIPEmbedder":
-> 1534         text_model = convert_ldm_clip_checkpoint(
   1535             checkpoint, local_files_only=local_files_only, text_encoder=text_encoder
   1536         )

/usr/local/lib/python3.10/dist-packages/diffusers/pipelines/stable_diffusion/convert_from_ckpt.py in convert_ldm_clip_checkpoint(checkpoint, local_files_only, text_encoder)
    779     if text_encoder is None:
    780         config_name = "openai/clip-vit-large-patch14"
--> 781         config = CLIPTextConfig.from_pretrained(config_name, local_files_only=local_files_only)
    782 
    783         ctx = init_empty_weights if is_accelerate_available() else nullcontext

/usr/local/lib/python3.10/dist-packages/transformers/models/clip/configuration_clip.py in from_pretrained(cls, pretrained_model_name_or_path, **kwargs)
    128     @classmethod
    129     def from_pretrained(cls, pretrained_model_name_or_path: Union[str, os.PathLike], **kwargs) -> "PretrainedConfig":
--> 130         config_dict, kwargs = cls.get_config_dict(pretrained_model_name_or_path, **kwargs)
    131 
    132         # get the text config dict if we are loading from CLIPConfig

/usr/local/lib/python3.10/dist-packages/transformers/configuration_utils.py in get_config_dict(cls, pretrained_model_name_or_path, **kwargs)
    572         original_kwargs = copy.deepcopy(kwargs)
    573         # Get config dict associated with the base config file
--> 574         config_dict, kwargs = cls._get_config_dict(pretrained_model_name_or_path, **kwargs)
    575         if "_commit_hash" in config_dict:
    576             original_kwargs["_commit_hash"] = config_dict["_commit_hash"]

/usr/local/lib/python3.10/dist-packages/transformers/configuration_utils.py in _get_config_dict(cls, pretrained_model_name_or_path, **kwargs)
    627             try:
    628                 # Load from local folder or from cache or download from model Hub and cache
--> 629                 resolved_config_file = cached_file(
    630                     pretrained_model_name_or_path,
    631                     configuration_file,

/usr/local/lib/python3.10/dist-packages/transformers/utils/hub.py in cached_file(path_or_repo_id, filename, cache_dir, force_download, resume_download, proxies, use_auth_token, revision, local_files_only, subfolder, repo_type, user_agent, _raise_exceptions_for_missing_entries, _raise_exceptions_for_connection_errors, _commit_hash)
    415     try:
    416         # Load from URL or cache if already cached
--> 417         resolved_file = hf_hub_download(
    418             path_or_repo_id,
    419             filename,

/usr/local/lib/python3.10/dist-packages/huggingface_hub/utils/_validators.py in _inner_fn(*args, **kwargs)
    116             kwargs = smoothly_deprecate_use_auth_token(fn_name=fn.__name__, has_token=has_token, kwargs=kwargs)
    117 
--> 118         return fn(*args, **kwargs)
    119 
    120     return _inner_fn  # type: ignore

/usr/local/lib/python3.10/dist-packages/huggingface_hub/file_download.py in hf_hub_download(repo_id, filename, subfolder, repo_type, revision, library_name, library_version, cache_dir, local_dir, local_dir_use_symlinks, user_agent, force_download, force_filename, proxies, etag_timeout, resume_download, token, local_files_only, legacy_cache_layout, endpoint)
   1221     url = hf_hub_url(repo_id, filename, repo_type=repo_type, revision=revision, endpoint=endpoint)
   1222 
-> 1223     headers = build_hf_headers(
   1224         token=token,
   1225         library_name=library_name,

/usr/local/lib/python3.10/dist-packages/huggingface_hub/utils/_validators.py in _inner_fn(*args, **kwargs)
    116             kwargs = smoothly_deprecate_use_auth_token(fn_name=fn.__name__, has_token=has_token, kwargs=kwargs)
    117 
--> 118         return fn(*args, **kwargs)
    119 
    120     return _inner_fn  # type: ignore

/usr/local/lib/python3.10/dist-packages/huggingface_hub/utils/_headers.py in build_hf_headers(token, is_write_action, library_name, library_version, user_agent)
    119     """
    120     # Get auth token to send
--> 121     token_to_send = get_token_to_send(token)
    122     _validate_token_to_send(token_to_send, is_write_action=is_write_action)
    123 

/usr/local/lib/python3.10/dist-packages/huggingface_hub/utils/_headers.py in get_token_to_send(token)
    146 
    147     # Token is not provided: we get it from local cache
--> 148     cached_token = get_token()
    149 
    150     # Case token is explicitly required

/usr/local/lib/python3.10/dist-packages/huggingface_hub/utils/_token.py in get_token()
     39         `str` or `None`: The token, `None` if it doesn't exist.
     40     """
---> 41     return _get_token_from_google_colab() or _get_token_from_environment() or _get_token_from_file()
     42 
     43 

/usr/local/lib/python3.10/dist-packages/huggingface_hub/utils/_token.py in _get_token_from_google_colab()
     57 
     58     try:
---> 59         token = userdata.get("HF_TOKEN")
     60     except userdata.NotebookAccessError:
     61         # Means the user has a secret call `HF_TOKEN` and got a popup "please grand access to HF_TOKEN" and refused it

/usr/local/lib/python3.10/dist-packages/google/colab/userdata.py in get(key)
     33     SecretNotFoundError: If the requested secret is not found.
     34   """
---> 35   resp = _message.blocking_request(
     36       'GetSecret', request={'key': key}, timeout_sec=None
     37   )

/usr/local/lib/python3.10/dist-packages/google/colab/_message.py in blocking_request(request_type, request, timeout_sec, parent)
    174       request_type, request, parent=parent, expect_reply=True
    175   )
--> 176   return read_reply_from_input(request_id, timeout_sec)

/usr/local/lib/python3.10/dist-packages/google/colab/_message.py in read_reply_from_input(message_id, timeout_sec)
     94     reply = _read_next_input_message()
     95     if reply == _NOT_READY or not isinstance(reply, dict):
---> 96       time.sleep(0.025)
     97       continue
     98     if (

KeyboardInterrupt:

System info

- huggingface_hub version: 0.20.1
- Platform: Linux-6.2.0-39-generic-x86_64-with-glibc2.35
- Python version: 3.10.12
- Running in iPython ?: No
- Running in notebook ?: No
- Running in Google Colab ?: No
- Token path ?: /home/tanaga/.cache/huggingface/token
- Has saved token ?: True
- Who am I ?: tanaga9
- Configured git credential helpers: store
- FastAI: 2.7.13
- Tensorflow: 2.10.1
- Torch: 2.1.0+cu118
- Jinja2: 3.1.2
- Graphviz: 0.20.1
- Pydot: 1.4.2
- Pillow: 9.4.0
- hf_transfer: N/A
- gradio: 4.12.0
- tensorboard: N/A
- numpy: 1.23.5
- pydantic: 2.5.3
- aiohttp: 3.8.6
- ENDPOINT: https://huggingface.co
- HF_HUB_CACHE: /home/tanaga/.cache/huggingface/hub
- HF_ASSETS_CACHE: /home/tanaga/.cache/huggingface/assets
- HF_TOKEN_PATH: /home/tanaga/.cache/huggingface/token
- HF_HUB_OFFLINE: False
- HF_HUB_DISABLE_TELEMETRY: False
- HF_HUB_DISABLE_PROGRESS_BARS: None
- HF_HUB_DISABLE_SYMLINKS_WARNING: False
- HF_HUB_DISABLE_EXPERIMENTAL_WARNING: False
- HF_HUB_DISABLE_IMPLICIT_TOKEN: False
- HF_HUB_ENABLE_HF_TRANSFER: False
- HF_HUB_ETAG_TIMEOUT: 10
- HF_HUB_DOWNLOAD_TIMEOUT: 10
tanaga9 added the bug label Dec 29, 2023

Wauplin commented Jan 3, 2024

Hi @tanaga9, sorry you're facing this issue, and thanks for reporting it. Could you tell me more about the environment you're using? I'm asking because if you reach line ---> 59 token = userdata.get("HF_TOKEN"), it means that:

  1. huggingface_hub detects that you are running in a Google Colab
    • => meaning that str(get_ipython()) contains "google.colab" as a string
  2. the google.colab.userdata module is importable locally
  3. but google.colab.userdata.get("HF_TOKEN") doesn't work
    • => which looks more like an issue with the google.colab package itself

What's also odd is that when you run huggingface-cli env, it doesn't detect you as running in a Colab (see the dump you copy-pasted: - Running in Google Colab ?: No). I'm in favor of a fix in huggingface_hub to prevent this if that's possible, but we need to understand better what's going on here first. Thanks in advance.
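
For reference, here is a minimal sketch of that chain of checks, paraphrased from the traceback and the points above (looks_like_colab and fetch_colab_token are illustrative names, not the library's own):

# Paraphrased sketch of the detection chain, not the exact
# huggingface_hub source. All three conditions must hold for the
# blocking userdata.get("HF_TOKEN") call to be reached.
from IPython import get_ipython

def looks_like_colab() -> bool:
    # Condition 1: the running IPython shell reports itself as Colab.
    return "google.colab" in str(get_ipython())

def fetch_colab_token():
    if not looks_like_colab():
        return None
    try:
        # Condition 2: google.colab.userdata is importable locally.
        from google.colab import userdata
    except ImportError:
        return None
    # Condition 3: with no Colab frontend attached to answer the
    # 'GetSecret' request, this call polls forever (the freeze above).
    return userdata.get("HF_TOKEN")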

tanaga9 commented Jan 4, 2024

Hi @Wauplin, thank you for your reply.
I am using the Google Colab image in Docker on Ubuntu.
The image name is asia-docker.pkg.dev/colab-images/public/runtime:
https://console.cloud.google.com/artifacts/docker/colab-images/asia/public/runtime

By setting FROM to that image and CMD to jupyter lab ..., I can use the same container both as a local JupyterLab and as a local runtime from Colab.

I tried creating separate images, but it was too much work, so I unified them.

In this state, get_ipython() returned a google.colab._shell.Shell, even when the container was used as a local JupyterLab (without going through the Colab interface).

[Screenshot sc_20240105_015735: notebook cell output showing get_ipython() returning a google.colab._shell.Shell instance]
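
This is easy to confirm from a notebook cell; for example (the exact output will vary by image version):

# Run in a notebook cell to see which shell class is active.
from IPython import get_ipython

shell = get_ipython()
print(type(shell))  # e.g. <class 'google.colab._shell.Shell'> in this image
print(str(shell))   # the string that huggingface_hub's Colab detection inspects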

Wauplin commented Jan 5, 2024

Hi @tanaga9, thanks for the context. I wasn't aware of this runtime.

I have been able to reproduce the issue and have created PR #1953 to fix it. The plan is to make a hot-fix release once it's merged and to make that release the default in Google Colab. In the meantime, a temporary workaround is to install a different version of huggingface_hub before executing any code. More details in #1953 (comment).
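
For example, pinning a release that predates the Colab secret lookup should avoid the freeze (0.19.4 here is illustrative, assuming the lookup arrived in 0.20.0; the exact version to install is given in the linked comment):

# Hypothetical pin: a pre-0.20 release, assumed to predate
# _get_token_from_google_colab. Run this before importing
# huggingface_hub or any library that uses it.
%pip install "huggingface_hub==0.19.4"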

tanaga9 commented Jan 5, 2024

Thanks.

There is no problem on my side, since a workaround already exists.
For reference, I have attached the link to the documentation I was looking at.

Wauplin commented Jan 5, 2024

#1953 is merged and the fix has been released (0.20.2). I'm closing this issue.

Wauplin closed this as completed Jan 5, 2024

Wauplin commented Jan 12, 2024

Update: huggingface_hub 0.20.2 is now the default version on Google Colab, meaning this issue should be fixed for everyone without having to upgrade (see googlecolab/colabtools#4272 (comment)).
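
To check whether a given environment already has the fix, comparing the installed version against 0.20.2 is enough; a minimal snippet:

# The fix shipped in huggingface_hub 0.20.2.
from packaging.version import Version
import huggingface_hub

assert Version(huggingface_hub.__version__) >= Version("0.20.2")
print("OK:", huggingface_hub.__version__)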
