get_language_model_class does not work when huggingface.co is offline, even if model is locally cached #2118
Comments
Hi @yunyu! I agree that you should be able to use a local model, if one is available, without connecting to Hugging Face. Internally, we wrap the sentence_transformers library when a sentence-transformers model is used (as in your case). The corresponding code is here:
Could you maybe try storing a sentence-transformers model locally and providing the file path instead of the model name during initialization? I'd be interested to see whether that already helps.
Hi, I just happened to bump into this issue. Have you tried setting TRANSFORMERS_OFFLINE=1 in the environment? https://github.com/huggingface/transformers/blob/8ce133063120683018b214fe10d1449e4c2401da/src/transformers/file_utils.py#L328
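For reference, a sketch of that environment-variable approach (the script name in the comment is a hypothetical placeholder):

```shell
# Force transformers to resolve everything from the local cache
# (~/.cache/huggingface) and never call huggingface.co.
# The variable must be set before the Python process imports transformers, e.g.:
#   TRANSFORMERS_OFFLINE=1 python my_haystack_script.py
export TRANSFORMERS_OFFLINE=1
```

With the flag set, transformers raises an error for any file missing from the cache instead of attempting a network request.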
Hi @yunyu, did the above suggestions work for you?
@yunyu: as this issue seems to have an appropriate solution, I'm closing it now. You can reopen it at any time if the solution does not work for you.
My closing thought here is that this is really a poor choice on Hugging Face's part: it reveals on every user's network that they are accessing a model each time one is loaded. It would seem more normal to rely on the cache until a timeout expires, and to fail gracefully if the network is offline. This can also be done with wrapper code.
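The graceful-fallback idea in that closing comment could be sketched like this, using the `local_files_only` argument that `from_pretrained` accepts (the function name is hypothetical, not Haystack's actual code):

```python
from transformers import AutoModel


def load_with_cache_fallback(model_name: str):
    """Try the hub first; fall back to the local cache if the network fails.

    A sketch of the wrapper idea above, not a definitive implementation.
    """
    try:
        return AutoModel.from_pretrained(model_name)
    except OSError:
        # transformers surfaces network failures as OSError; retry using
        # only files already present in ~/.cache/huggingface.
        return AutoModel.from_pretrained(model_name, local_files_only=True)
```

This keeps the normal online behavior but lets a program started while huggingface.co is down still come up from the cache.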
Describe the bug
huggingface.co is currently offline. I am running Haystack with the models cached in ~/.cache/huggingface, but my program still fails.
Error message
Error that was thrown (if available)
Expected behavior
If huggingface.co is down and models are locally cached, the locally cached versions of the models should be used
Additional context
https://status.huggingface.co/
To Reproduce
Run
FAQ Check
System: