Transformers v4.22 removed the `cached_path` method with huggingface/transformers#18497. This results in the following error when running with transformers==4.22.0:

`ImportError: cannot import name 'cached_path' from 'transformers.file_utils'`

The translation pipeline needs to be updated with an equivalent way to provide `cached_path`-like functionality for files not on the Hugging Face Hub.

The workaround for txtai < 5.0 is to pin transformers to <= 4.21.3.
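One possible replacement, sketched here with only the Python standard library, is a small download-and-cache helper for files that are not hosted on the Hub. The `cache_path` name and cache directory below are illustrative assumptions, not the actual txtai or transformers API:

```python
import hashlib
import os
import tempfile
from urllib.request import urlretrieve

def cache_path(url, cache_dir=None):
    """Download url to a local cache directory and return the cached path.

    Skips the download when the file is already cached, mimicking the
    behavior previously provided by transformers' cached_path.
    """
    # Default to a cache directory under the system temp directory (assumption)
    cache_dir = cache_dir or os.path.join(tempfile.gettempdir(), "txtai-cache")
    os.makedirs(cache_dir, exist_ok=True)

    # Derive a stable filename from the url so repeat calls hit the cache
    name = hashlib.sha256(url.encode("utf-8")).hexdigest()
    path = os.path.join(cache_dir, name)

    if not os.path.exists(path):
        urlretrieve(url, path)

    return path
```

Files already on the Hugging Face Hub can instead be resolved with `huggingface_hub.hf_hub_download`, which handles caching on its own.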
This issue also addresses a UserWarning in the translation pipeline regarding setting max_length in the `model.generate` call.