Your demo code on HuggingFace is throwing 502 Gateway error #49

Open
vbayanag opened this issue Jun 26, 2024 · 7 comments
@vbayanag

Please provide instructions on how to evaluate the AlphaCLIP MLLM model.

@SunzeY (Owner) commented Jun 30, 2024

The OpenXLab resources are limited, but the source code is available in the demo directory, so you can run it locally.

@vbayanag (Author) commented Jul 11, 2024

Hey SunzeY,
Your with_llm demo is throwing the following error. I'm running the code on an AMD GPU with the rocm/pytorch:rocm6.1.2_ubuntu20.04_py3.9_pytorch_release-2.1.2 Docker image.

ImportError: cannot import name '_expand_mask' from 'transformers.models.bloom.modeling_bloom' (/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bloom/modeling_bloom.py)

@SunzeY (Owner) commented Jul 12, 2024

Maybe the torchvision version is mismatched? You should first make sure you can run the original LLaVA code in your environment before testing Alpha-CLIP; any environment problem is better resolved in the LLaVA repo.
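
A quick way to confirm the versions in question inside the container (a minimal sketch, nothing specific to Alpha-CLIP):

```python
# Print the package versions the LLaVA-style code is sensitive to before
# debugging the with_llm demo any further.
import torch
import torchvision
import transformers

print("torch:", torch.__version__)
print("torchvision:", torchvision.__version__)
print("transformers:", transformers.__version__)
# torch.cuda.is_available() also reports True for ROCm builds of PyTorch.
print("GPU backend available:", torch.cuda.is_available())
```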

@vbayanag (Author)

Also, earlier I got an error at https://github.com/SunzeY/AlphaCLIP/blob/main/demo/with_llm/llava/model/language_model/llava_llama.py#L139:

"llava" is already used by a Transformers model.

Is it expected that I change the registration name on L139 when I run the demo?

@SunzeY (Owner) commented Jul 12, 2024

You can change the transformers version to 4.36.2, since newer releases already include LLaVA as an official model. If nothing went wrong after changing the registration name, that is also fine.
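
For reference, a hedged sketch of what the renamed registration might look like, assuming the file follows the upstream LLaVA pattern of registering a custom config and model with the Auto classes (the exact code in AlphaCLIP's copy may differ); the name "llava_alpha" is purely hypothetical:

```python
# Sketch of the registration at the end of llava_llama.py, assuming the
# upstream LLaVA pattern. LlavaConfig and LlavaLlamaForCausalLM refer to the
# classes defined earlier in that same file, not new names introduced here.
from transformers import AutoConfig, AutoModelForCausalLM

# On transformers >= 4.36 the model type "llava" is taken by the built-in
# model, so registering under it raises the "already used by a Transformers
# config" error. A distinct (hypothetical) name avoids the clash.
AutoConfig.register("llava_alpha", LlavaConfig)                 # was: "llava"
AutoModelForCausalLM.register(LlavaConfig, LlavaLlamaForCausalLM)
```

Note that a checkpoint whose config.json declares model_type "llava" will no longer resolve to this class through the Auto* loaders after the rename, so this only works cleanly if the demo constructs LlavaLlamaForCausalLM directly (or the config's model_type is updated to match).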

@vbayanag (Author)

After changing the name I got the error below on transformers 4.42+:
ImportError: cannot import name '_expand_mask' from 'transformers.models.bloom.modeling_bloom' (/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bloom/modeling_bloom.py)

Let me try a different version now. Did you happen to get a chance to run the demo on AMD hardware? If not, I'd be happy to help.
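
For anyone hitting the same ImportError: newer transformers releases refactored attention-mask handling and removed the module-level _expand_mask helper from modeling_bloom, which is why the import fails on 4.42+. Pinning transformers to the version the repo expects is the cleaner fix; if that isn't an option, a hedged monkeypatch sketch (assuming the old Bloom helper's boolean-mask behavior) would be run before importing the demo code:

```python
# Restore a module-level _expand_mask on modeling_bloom so the LLaVA/MPT code
# that imports it keeps working. The implementation below assumes the old
# helper's behavior: expand a [batch, src_len] attention mask to a boolean
# [batch, 1, tgt_len, src_len] mask where True marks positions to mask out.
import torch
import transformers.models.bloom.modeling_bloom as modeling_bloom

if not hasattr(modeling_bloom, "_expand_mask"):
    def _expand_mask(mask: torch.Tensor, tgt_length: int) -> torch.BoolTensor:
        batch_size, src_length = mask.shape
        tgt_length = tgt_length if tgt_length is not None else src_length
        expanded_mask = ~(mask[:, None, None, :].to(torch.bool))
        return expanded_mask.expand(batch_size, 1, tgt_length, src_length)

    modeling_bloom._expand_mask = _expand_mask
```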

@SunzeY (Owner) commented Jul 19, 2024

Nope.
