
import pytorch_lightning fails with ValueError: transformers.models.auto.__spec__ is None #11524

Closed
nonstopfor opened this issue Jan 18, 2022 · 4 comments · Fixed by #11526
Labels
bug Something isn't working

Comments

@nonstopfor

🐛 Bug

A ValueError occurs whenever PyTorch Lightning is imported after Hugging Face's Transformers. Importing Transformers after PyTorch Lightning, however, does not raise any error.

To Reproduce

Tested in a local Anaconda environment. Run these two imports in order:

from transformers import T5ForConditionalGeneration, T5Tokenizer
import pytorch_lightning

It seems that this issue doesn't exist on Google Colab.

Expected behavior

ValueError: transformers.models.auto.__spec__ is None
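For context on the mechanism (my reading of the traceback, not stated in the thread): this exact message format is what CPython's importlib.util.find_spec raises when a module already present in sys.modules has __spec__ set to None, which Transformers' lazy module proxy can leave behind. A minimal, self-contained sketch of that mechanism, using a made-up module name:

```python
import importlib.util
import sys
import types

# Register a fake module whose __spec__ is None, mimicking a lazily
# initialized package (hypothetical name, for illustration only).
fake = types.ModuleType("fake_lazy_pkg")
fake.__spec__ = None
sys.modules["fake_lazy_pkg"] = fake

try:
    importlib.util.find_spec("fake_lazy_pkg")
except ValueError as err:
    print(err)  # fake_lazy_pkg.__spec__ is None
```

Any library that probes for an optional dependency via find_spec (as torchmetrics does) can trip over this, which is why the error surfaces only with a particular import order.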

Environment

  • CUDA:
    - GPU:
    - TITAN X (Pascal)
    - TITAN X (Pascal)
    - TITAN X (Pascal)
    - TITAN X (Pascal)
    - available: True
    - version: 10.1
  • Packages:
    - numpy: 1.22.1
    - pyTorch_debug: False
    - pyTorch_version: 1.8.1+cu101
    - pytorch-lightning: 1.5.8
    - tqdm: 4.61.2
    - transformers: 4.15.0
  • System:
    - OS: Linux
    - architecture:
    - 64bit
    - ELF
    - processor: x86_64
    - python: 3.9.5
    - version: #124-Ubuntu SMP Fri Nov 10 18:29:59 UTC 2017

Additional context

@nonstopfor nonstopfor added the bug Something isn't working label Jan 18, 2022
@akihironitta
Contributor

akihironitta commented Jan 18, 2022

It seems to happen with torchmetrics==0.7.0, which was released yesterday, as reported in Lightning-AI/torchmetrics#770.


A quick workaround is to avoid torchmetrics==0.7.0:

pip install "torchmetrics<0.7"
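Beyond pinning at install time, a project could also fail fast at runtime. This is a hedged sketch (my own helper functions, not part of Lightning or torchmetrics) that flags only the 0.7.0 release reported broken in this thread:

```python
from importlib.metadata import PackageNotFoundError, version


def is_affected(torchmetrics_version: str) -> bool:
    """True only for the 0.7.0 release reported broken in this thread."""
    parts = tuple(int(p) for p in torchmetrics_version.split(".")[:3])
    return parts == (0, 7, 0)


def check_torchmetrics() -> None:
    """Raise early, with a pointer to the workaround, if 0.7.0 is installed."""
    try:
        installed = version("torchmetrics")
    except PackageNotFoundError:
        return  # not installed; nothing to check
    if is_affected(installed):
        raise RuntimeError(
            "torchmetrics==0.7.0 breaks `import pytorch_lightning` after "
            "transformers; install with: pip install 'torchmetrics<0.7'"
        )
```

Whether to hard-error or merely warn is a project choice; the pin above remains the simplest fix.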

@nonstopfor
Author

Thanks! It is solved now.

@Borda Borda mentioned this issue Jan 18, 2022
@akihironitta
Contributor

Let's keep this issue open until it gets fixed :)

@akihironitta akihironitta reopened this Jan 18, 2022
rosbo added a commit to Kaggle/docker-python that referenced this issue Jan 18, 2022
`torchmetrics` 0.7.0 released yesterday (Jan 17th 2022) is causing an issue with pytorch_lightning: Lightning-AI/pytorch-lightning#11524
@akihironitta akihironitta changed the title Importing PyTorch-Lightning after Transformer leads to a ValueError import pytorch_lightning fails with ValueError: transformers.models.auto.__spec__ is None Jan 20, 2022
@BenoitDalFerro

Exact same issue, exact same diagnosis, with an install as of yesterday, 22/01/2022 20:51 GMT+1:
torchmetrics 0.7.0 pyhd8ed1ab_0 conda-forge
Downgrading to torchmetrics<0.7.0 as a quick fix did the trick.
