Runtime Error in Python 3.8 #1691
Same for us.
Same for me. In my Docker env: `Downloading tokenizers-0.20.4.tar.gz (343 kB)` (pip falls back to the sdist instead of a wheel).
The problem only occurs in the Python 3.8 test.
Yep, I'll try to push the 3.8 version; this is kind of a breaking change that comes from the recent ABI update that we needed. Sorry all!
abi3 wheels were introduced in #1674 to reduce the crazy number of wheels we had.
We're going to yank the release and publish 0.21.0 instead.
Closing as the release is yanked! |
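For context on why only 3.8 broke: an abi3 wheel built against CPython 3.9 is tagged `cp39-abi3`, which pip accepts on 3.9 and newer but not on 3.8, so 3.8 environments fall back to building the sdist (the `Downloading tokenizers-0.20.4.tar.gz` line above). A minimal sketch of checking this, assuming the third-party `packaging` library; the Linux platform tag is illustrative:

```python
# Sketch: test whether the running interpreter accepts a cp39-abi3 wheel.
# Assumes the third-party `packaging` library is installed.
from packaging.tags import Tag, sys_tags

# Tag of the kind of wheel published for tokenizers 0.20.4
# (the platform part is illustrative, not the full published set).
wheel_tag = Tag("cp39", "abi3", "manylinux_2_17_x86_64")

# sys_tags() yields every tag this interpreter can install.
print(wheel_tag in set(sys_tags()))
# True on CPython 3.9+ (Linux x86_64); False on CPython 3.8,
# where pip has to fall back to the sdist instead.
```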
Added llama benchmark:
- training and inference in accordance with the existing PyTorch model implementations like gpt2, lstm, etc.
- added llama fp8 unit test for better code coverage and to reduce the memory required
- updated transformers version >= 4.28.0 for LlamaConfig
- set tokenizers version <= 0.20.3 to avoid the 0.20.4 [issues](huggingface/tokenizers#1691) with py3.8 (a pin of this shape is sketched below)
- added llama2 to TensorRT
- llama2 tests not added to test_tensorrt_inference_performance.py due to the large memory requirement for the worker GPU; tests validated separately on GH200

Co-authored-by: dpatlolla <dpatlolla@microsoft.com>
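A hedged sketch of how such a pin can be expressed with a PEP 508 environment marker, so only Python 3.8 environments are capped; the package name and the setup.py fragment are hypothetical, not taken from that PR:

```python
# Hypothetical setup.py fragment: cap tokenizers only where the
# cp39-abi3 wheels from 0.20.4 cannot be installed.
from setuptools import setup

setup(
    name="llama-benchmark-example",  # hypothetical package name
    install_requires=[
        "transformers>=4.28.0",
        'tokenizers<=0.20.3; python_version < "3.9"',
    ],
)
```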
My GitHub Actions pipeline failed on importing Transformers after upgrading from tokenizers 0.20.3 to 0.20.4. Specifically, it fails early on this line in my pytest code:
```python
from transformers import AutoModel, AutoTokenizer
```
It passed for Python 3.9, 3.10, and 3.11, but failed specifically for Python 3.8. This is the error:
Looking at the release notes, there was a change to abi3: https://github.com/huggingface/tokenizers/releases/tag/v0.20.4. I'm assuming it's related.
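For pipelines that must keep running on 3.8, one workable guard is to skip at collection time when the broken combination is installed. A minimal sketch of a hypothetical test module (the skip condition and message are assumptions, not from this issue):

```python
# Hypothetical pytest module guard: skip on Python 3.8 when the
# known-broken tokenizers 0.20.4 is installed.
import sys
from importlib.metadata import version  # stdlib on Python 3.8+

import pytest

if sys.version_info < (3, 9) and version("tokenizers") == "0.20.4":
    pytest.skip(
        "tokenizers 0.20.4 ships cp39-abi3 wheels and breaks on 3.8",
        allow_module_level=True,
    )

from transformers import AutoModel, AutoTokenizer  # noqa: E402
```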