
[Bug/Model Request]: I cannot download any model #349

Open
eduardfossas opened this issue Sep 23, 2024 · 7 comments

Comments

@eduardfossas

What happened?

I have been working with Qdrant for the past month and everything was going great. Suddenly, over the past week, when I try to download a model (such as the base model BAAI/bge-small-en) while adding elements to a collection, I receive the following message:

2024-09-23 20:02:31.360 | ERROR | fastembed.common.model_management:download_model:256 - Could not download model from url: https://storage.googleapis.com/qdrant-fastembed/BAAI-bge-small-en.tar.gz

I am using fastembed 3.6.0

What Python version are you on? e.g. python --version

<3.13,>=3.8.0

Version

0.2.7 (Latest)

What OS are you seeing the problem on?

Windows

Relevant stack traces and/or logs

No response

@joein
Member

joein commented Sep 24, 2024

Hey @eduardfossas, have you tried other models?
Some of the models are downloaded from our GCP storage, while others are downloaded from the HF Hub.
Could you please try downloading another model, e.g. BAAI/bge-small-en-v1.5?

@eduardfossas
Author

Hmm, that seemed to work. I think the problem is the GCP storage. Thanks!

@joein
Member

joein commented Sep 24, 2024

Is there any other traceback available for the models that are failing for you?
Have you maybe changed some of your firewall rules or made any other network-related updates?

@eduardfossas
Author

Hello @joein. Yes, the only thing I did was delete the cache folder in temp. Maybe that was the issue too.

@aniketwattamwar

Hey,
How do I resolve this issue? It was all working fine, but now when I try I get the same error: ValueError: Could not download model BAAI/bge-small-en from any source.

@eduardfossas
Author

Hello, @aniketwattamwar. That's a good question. I added this at the top of my Python file and it solved it for me:

model = TextEmbedding(model_name="BAAI/bge-small-en-v1.5")

@aniketwattamwar

Thank you @eduardfossas
That worked for me too. A bit weird that it stopped working suddenly.
client.set_model("BAAI/bge-small-en-v1.5")
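Both workarounds above pin a model (BAAI/bge-small-en-v1.5) that is served from the HF Hub rather than the GCP bucket that was failing. The error in the thread ("Could not download model ... from any source") suggests a try-each-source-in-order pattern; here is a minimal, generic sketch of that fallback logic. This is an illustration, not fastembed's actual internals — the `download_with_fallback` helper and the source names are hypothetical:

```python
from typing import Callable, Dict


def download_with_fallback(model_name: str,
                           sources: Dict[str, Callable[[str], str]]) -> str:
    """Try each named download source in order; return the first success.

    If every source fails (e.g. the GCP bucket is unreachable, as in the
    thread), raise a ValueError resembling fastembed's message.
    """
    for name, fetch in sources.items():
        try:
            return fetch(model_name)
        except Exception:
            continue  # this source failed; fall through to the next one
    raise ValueError(f"Could not download model {model_name} from any source.")


# Hypothetical sources: the first is unreachable, the second succeeds.
def from_gcp(model: str) -> str:
    raise ConnectionError("Could not download model from GCP url")


def from_hf_hub(model: str) -> str:
    return f"/tmp/fastembed_cache/{model}"


path = download_with_fallback(
    "BAAI/bge-small-en-v1.5",
    {"gcp": from_gcp, "hf_hub": from_hf_hub},
)
```

With this pattern, a model that exists on a later source still resolves even when an earlier source is down — which matches why switching to the v1.5 model name (hosted on the HF Hub) worked around the GCP outage.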
