Your demo code on HuggingFace is throwing 502 Gateway error #49
The OpenXLab resource is limited; the source code is available in this repository.
Hey SunzeY,
Maybe the torchvision version doesn't match? You should make sure you can run the original LLaVA code in your environment before testing Alpha-CLIP; any environment problem is better solved in the LLaVA repo.
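As a quick sanity check, you can print the versions of the relevant packages installed in your environment before running the demo. This is a minimal sketch, not the official requirements list; the exact pins live in the LLaVA repo's setup files, and the package list below is only illustrative.

```python
# Minimal environment check before running the Alpha-CLIP + LLaVA demo.
# Compare the printed versions against the ones pinned by the LLaVA repo;
# the package list here is illustrative, not exhaustive.
import importlib.metadata as md

for pkg in ("torch", "torchvision", "transformers", "tokenizers", "accelerate"):
    try:
        print(f"{pkg}=={md.version(pkg)}")
    except md.PackageNotFoundError:
        print(f"{pkg} is not installed")
```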
Also, earlier I got an error at https://github.com/SunzeY/AlphaCLIP/blob/main/demo/with_llm/llava/model/language_model/llava_llama.py#L139
Is it expected that I need to change the registration name on line 139 when I run the demo?
You can change the transformers version to 4.36.2, since newer releases already include LLaVA as an official model. If nothing went wrong after changing the name, that is also fine.
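For reference, the clash comes from the AutoConfig/AutoModelForCausalLM registration at the end of llava_llama.py: newer transformers releases already ship an official "llava" model type, so registering a second config under the same name raises an error. Below is a minimal sketch of the rename workaround; the key "llava_alpha" is only a hypothetical example, not the repo's official choice, and it assumes LlavaConfig and LlavaLlamaForCausalLM are the classes defined earlier in that file.

```python
# Sketch of the registration at the bottom of
# demo/with_llm/llava/model/language_model/llava_llama.py.
# "llava_alpha" is a hypothetical example key used to avoid colliding with
# the official "llava" model type shipped in newer transformers releases.
from transformers import AutoConfig, AutoModelForCausalLM

# Keep the config's model_type in sync with the registration key, since
# AutoConfig.register rejects a key that differs from config.model_type.
LlavaConfig.model_type = "llava_alpha"

AutoConfig.register("llava_alpha", LlavaConfig)
AutoModelForCausalLM.register(LlavaConfig, LlavaLlamaForCausalLM)
```

Pinning transformers==4.36.2, as suggested above, avoids touching the file at all; the rename is only needed on newer releases.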
After changing the name, I got the error below on transformers 4.42+. Let me try a different version now. Did you happen to get a chance to run the demo on AMD hardware? Otherwise, I'd be happy to help.
Nope.
Please provide instructions on how to evaluate the AlphaCLIP MLLM model.