Labels: new-model (Requests to new models), stale (Over 90 days of inactivity)
Description
Your current environment
The output of python collect_env.py
Your output of `python collect_env.py` here
🐛 Describe the bug
Hi, when I run `vllm serve` to launch the jina-embeddings-v4 model, my deploy script is:
export VLLM_USE_V1=0
MODEL_PATH="./chkpt/jina-embeddings-v4/"
vllm serve $MODEL_PATH --served-model-name model --trust-remote-code --task embed
I get the following error:
ValueError: JinaEmbeddingsV4Model has no vLLM implementation and the Transformers implementation is not compatible with vLLM. Try setting VLLM_USE_V1=0.
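In the meantime I am computing embeddings outside vLLM as a stopgap. A minimal sketch, assuming the local checkpoint can be loaded through sentence-transformers with trust_remote_code=True (not verified for this model; the exact loading path and any task-specific prompts may differ):

```python
# Stopgap sketch: compute embeddings without vLLM while JinaEmbeddingsV4Model is unsupported.
# Assumption (not verified): the local checkpoint loads via sentence-transformers
# with trust_remote_code=True; the model's remote code may expect task-specific prompts.
from sentence_transformers import SentenceTransformer

MODEL_PATH = "./chkpt/jina-embeddings-v4/"  # same local path used in the serve command

model = SentenceTransformer(MODEL_PATH, trust_remote_code=True)
embeddings = model.encode(["an example sentence to embed"])
print(embeddings.shape)
```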
Looking forward to your reply.
Before submitting a new issue...
- Make sure you already searched for relevant issues, and asked the chatbot living at the bottom right corner of the documentation page, which can answer lots of frequently asked questions.