ModuleNotFoundError: No module named 'tokenizers.tokenizers' #1619
Hey, are you sure you have …
Hi, I also have this problem.
When I try:

```python
from transformers import RobertaTokenizer
```

I get the following error:

```
ModuleNotFoundError                       Traceback (most recent call last)
File d:\Apps\envs\cuda11\Lib\site-packages\transformers\utils\import_utils.py:1535, in _LazyModule._get_module(self, module_name)
   1534 try:
-> 1535     return importlib.import_module("." + module_name, self.__name__)
   1536 except Exception as e:

File d:\Apps\envs\cuda11\Lib\importlib\__init__.py:126, in import_module(name, package)
    125     level += 1
--> 126 return _bootstrap._gcd_import(name[level:], package, level)

File <frozen importlib._bootstrap>:1204, in _gcd_import(name, package, level)
File <frozen importlib._bootstrap>:1176, in _find_and_load(name, import_)
File <frozen importlib._bootstrap>:1126, in _find_and_load_unlocked(name, import_)
File <frozen importlib._bootstrap>:241, in _call_with_frames_removed(f, *args, **kwds)
File <frozen importlib._bootstrap>:1204, in _gcd_import(name, package, level)
File <frozen importlib._bootstrap>:1176, in _find_and_load(name, import_)
File <frozen importlib._bootstrap>:1147, in _find_and_load_unlocked(name, import_)
...
   1539     f" traceback):\n{e}"
   1540 ) from e

RuntimeError: Failed to import transformers.models.roberta because of the following error (look up to see its traceback):
No module named 'tokenizers.metrics'
```
It looks like Transformers 4.44.2 installs tokenizers 0.19.0, which does not work with Python 3.9.19. When I installed tokenizers 0.19.1 it worked. Tokenizers 0.20 is not compatible.
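Based on the versions reported in this thread, a quick way to check whether your environment matches one of the combinations described above is to read the installed package versions with `importlib.metadata`. A minimal diagnostic sketch — the "known bad" ranges below are only the versions mentioned in this thread (tokenizers 0.19.0 and 0.20.x on Python 3.9), not an official compatibility matrix:

```python
from importlib.metadata import version, PackageNotFoundError


def tokenizers_version_ok(ver: str) -> bool:
    """Return True if `ver` avoids the combinations reported in this issue."""
    parts = tuple(int(p) for p in ver.split(".")[:3])
    if parts[:3] == (0, 19, 0):  # reported broken with Python 3.9.19
        return False
    if parts[:2] == (0, 20):     # reported incompatible with transformers 4.44.2
        return False
    return True


def check_environment() -> None:
    # Print what pip actually installed, then flag a reported-bad tokenizers build.
    for pkg in ("transformers", "tokenizers"):
        try:
            print(pkg, version(pkg))
        except PackageNotFoundError:
            print(pkg, "is not installed")
    try:
        if not tokenizers_version_ok(version("tokenizers")):
            print("tokenizers matches a version reported broken in this thread;"
                  " try: pip install tokenizers==0.19.1")
    except PackageNotFoundError:
        pass


if __name__ == "__main__":
    check_environment()
```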
But when I install Transformers I get the version …
@mohjalanji yes
Is Transformers 4.44.0 compatible with Python 3.9.19? I get the error below; any thoughts are appreciated.
```python
import transformers
from transformers import AutoTokenizer
```

Produces the following error:

```
ModuleNotFoundError                       Traceback (most recent call last)
D:\Anaconda3\lib\site-packages\transformers\utils\import_utils.py in _get_module(self, module_name)
   1602 try:
-> 1603     return importlib.import_module("." + module_name, self.__name__)
   1604 except Exception as e:

D:\Anaconda3\lib\importlib\__init__.py in import_module(name, package)
    126     level += 1
--> 127 return _bootstrap._gcd_import(name[level:], package, level)
    128

D:\Anaconda3\lib\importlib\_bootstrap.py in _gcd_import(name, package, level)
D:\Anaconda3\lib\importlib\_bootstrap.py in _find_and_load(name, import_)
D:\Anaconda3\lib\importlib\_bootstrap.py in _find_and_load_unlocked(name, import_)
D:\Anaconda3\lib\importlib\_bootstrap.py in _load_unlocked(spec)
D:\Anaconda3\lib\importlib\_bootstrap_external.py in exec_module(self, module)
D:\Anaconda3\lib\importlib\_bootstrap.py in _call_with_frames_removed(f, *args, **kwds)

D:\Anaconda3\lib\site-packages\transformers\integrations\ggml.py in <module>
     23 import numpy as np
---> 24 from tokenizers import Tokenizer, decoders, normalizers, pre_tokenizers
     25 from tokenizers.models import BPE

D:\Anaconda3\lib\site-packages\tokenizers\__init__.py in <module>
     77
---> 78 from .tokenizers import (
     79     AddedToken,

ModuleNotFoundError: No module named 'tokenizers.tokenizers'

The above exception was the direct cause of the following exception:

RuntimeError                              Traceback (most recent call last)
D:\Anaconda3\lib\site-packages\transformers\utils\import_utils.py in _get_module(self, module_name)
   1602 try:
-> 1603     return importlib.import_module("." + module_name, self.__name__)
   1604 except Exception as e:

D:\Anaconda3\lib\importlib\__init__.py in import_module(name, package)
    126     level += 1
--> 127 return _bootstrap._gcd_import(name[level:], package, level)
    128

D:\Anaconda3\lib\importlib\_bootstrap.py in _gcd_import(name, package, level)
D:\Anaconda3\lib\importlib\_bootstrap.py in _find_and_load(name, import_)
D:\Anaconda3\lib\importlib\_bootstrap.py in _find_and_load_unlocked(name, import_)
D:\Anaconda3\lib\importlib\_bootstrap.py in _load_unlocked(spec)
D:\Anaconda3\lib\importlib\_bootstrap_external.py in exec_module(self, module)
D:\Anaconda3\lib\importlib\_bootstrap.py in _call_with_frames_removed(f, *args, **kwds)

D:\Anaconda3\lib\site-packages\transformers\models\auto\tokenization_auto.py in <module>
     23
---> 24 from ...configuration_utils import PretrainedConfig
     25 from ...dynamic_module_utils import get_class_from_dynamic_module, resolve_trust_remote_code

D:\Anaconda3\lib\site-packages\transformers\configuration_utils.py in <module>
     28 from .dynamic_module_utils import custom_object_save
---> 29 from .modeling_gguf_pytorch_utils import load_gguf_checkpoint
     30 from .utils import (

D:\Anaconda3\lib\site-packages\transformers\modeling_gguf_pytorch_utils.py in <module>
     21
---> 22 from .integrations import (
     23     GGUF_CONFIG_MAPPING,

D:\Anaconda3\lib\importlib\_bootstrap.py in _handle_fromlist(module, fromlist, import_, recursive)

D:\Anaconda3\lib\site-packages\transformers\utils\import_utils.py in __getattr__(self, name)
   1592 elif name in self._class_to_module.keys():
-> 1593     module = self._get_module(self._class_to_module[name])
   1594     value = getattr(module, name)

D:\Anaconda3\lib\site-packages\transformers\utils\import_utils.py in _get_module(self, module_name)
   1604 except Exception as e:
-> 1605     raise RuntimeError(
   1606         f"Failed to import {self.__name__}.{module_name} because of the following error (look up to see its"

RuntimeError: Failed to import transformers.integrations.ggml because of the following error (look up to see its traceback):
No module named 'tokenizers.tokenizers'

The above exception was the direct cause of the following exception:

RuntimeError                              Traceback (most recent call last)
~\AppData\Local\Temp\ipykernel_4232\77889138.py in <module>
----> 1 from transformers import AutoTokenizer

D:\Anaconda3\lib\importlib\_bootstrap.py in _handle_fromlist(module, fromlist, import_, recursive)

D:\Anaconda3\lib\site-packages\transformers\utils\import_utils.py in __getattr__(self, name)
   1592 elif name in self._class_to_module.keys():
   1593     module = self._get_module(self._class_to_module[name])
-> 1594     value = getattr(module, name)
   1595 else:
   1596     raise AttributeError(f"module {self.__name__} has no attribute {name}")

D:\Anaconda3\lib\site-packages\transformers\utils\import_utils.py in __getattr__(self, name)
   1591     value = self._get_module(name)
   1592 elif name in self._class_to_module.keys():
-> 1593     module = self._get_module(self._class_to_module[name])
   1594     value = getattr(module, name)
   1595 else:

D:\Anaconda3\lib\site-packages\transformers\utils\import_utils.py in _get_module(self, module_name)
   1603     return importlib.import_module("." + module_name, self.__name__)
   1604 except Exception as e:
-> 1605     raise RuntimeError(
   1606         f"Failed to import {self.__name__}.{module_name} because of the following error (look up to see its"
   1607         f" traceback):\n{e}"

RuntimeError: Failed to import transformers.models.auto.tokenization_auto because of the following error (look up to see its traceback):
Failed to import transformers.integrations.ggml because of the following error (look up to see its traceback):
No module named 'tokenizers.tokenizers'
```
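The layered "Failed to import … (look up to see its traceback)" messages in the traceback above come from transformers' lazy-import machinery: `_LazyModule._get_module` catches any import failure and re-raises it as a `RuntimeError` chained to the original exception, so the real `ModuleNotFoundError` is the `__cause__` at the bottom of the chain. A simplified stand-in, not the real `_LazyModule` implementation:

```python
import importlib


class LazyModule:
    """Toy version of the lazy-import wrapper visible in the traceback."""

    def __init__(self, name: str):
        self.__name__ = name

    def _get_module(self, module_name: str):
        try:
            return importlib.import_module("." + module_name, self.__name__)
        except Exception as e:
            # This re-raise produces the chained "direct cause" tracebacks.
            raise RuntimeError(
                f"Failed to import {self.__name__}.{module_name} because of the"
                f" following error (look up to see its traceback):\n{e}"
            ) from e


lazy = LazyModule("importlib")  # a real package, so the relative import resolves
try:
    lazy._get_module("does_not_exist")
except RuntimeError as err:
    print(type(err.__cause__).__name__)  # prints "ModuleNotFoundError"
```

This is why fixing the underlying `tokenizers` install makes the whole chain of `RuntimeError`s disappear at once.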