
How to load convnext model using local bin file #7

Open
yuecao0119 opened this issue Mar 13, 2024 · 7 comments
@yuecao0119

Hello, thanks for your work.

I would like the convnext model to load from local files instead of downloading them every time. How can this be done? I tried modifying the code, but to no avail.

Could you please give me some guidance? Thank you very much.

@aoji0606

I have the same problem

@gapjialin

I have the same problem

@luogen1996 (Owner)

Download the weights from here, then change `--vision_tower_slow convnext_large_mlp.clip_laion2b_ft_320` so that it points to your download path, e.g., `--vision_tower_slow ./weights/convnext_large_mlp.clip_laion2b_ft_320`.
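As a concrete sketch of the two steps above: the commands below assume the weights are hosted on the Hugging Face Hub under `timm/convnext_large_mlp.clip_laion2b_ft_320` (the repo name is an assumption based on the model name) and that `train.py` is the launch script (also an assumption; substitute your actual entry point).

```shell
# Assumption: the checkpoint is published on the HF Hub under the timm org.
huggingface-cli download timm/convnext_large_mlp.clip_laion2b_ft_320 \
    --local-dir ./weights/convnext_large_mlp.clip_laion2b_ft_320

# Hypothetical launch command; only the --vision_tower_slow flag is from the repo.
python train.py \
    --vision_tower_slow ./weights/convnext_large_mlp.clip_laion2b_ft_320
```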

@aoji0606

I did it exactly the way you said, but it didn't work (error screenshots attached).

@luogen1996 (Owner)

This is because your network cannot connect to the huggingface server. As far as I know, even when loading weights locally, the library may still try to reach the huggingface server. I recommend using a VPN such as pigcha.
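If the machine truly cannot reach huggingface.co, another workaround (my suggestion, not something this repo documents) is to put the Hugging Face libraries into offline mode, so they resolve models from the local cache and local paths without any network calls:

```shell
# Force huggingface_hub and transformers to skip all network requests
# and read only from the local cache or explicit local paths.
export HF_HUB_OFFLINE=1
export TRANSFORMERS_OFFLINE=1
```

Note this only helps once the weights are already on disk; with offline mode enabled, any model name that is not cached locally will fail fast instead of hanging on the network.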

@chuangzhidan

clip_laion2b_ft_320

Which model/weights exactly should we download? There are so many choices.

@FanshuoZeng

> I did it exactly the way you said, but it didn't work (error screenshots attached).

Did you solve this problem?
