[Question]: Does qwen2-7B support tool_choice set to required? #980
Replies: 3 comments 4 replies
-
Hi, please refer to Qwen-Agent if you need the model to use a specific tool. Follow the instructions there, and you should be able to pass tool_choice as needed.
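A minimal sketch of that route, following the Qwen-Agent README; the endpoint URL, model name, and the code_interpreter tool below are placeholders rather than a verified configuration:

```python
# Rough sketch following the Qwen-Agent README; the endpoint URL, model name,
# and the built-in code_interpreter tool are placeholders, not a verified setup.
from qwen_agent.agents import Assistant

llm_cfg = {
    'model': 'Qwen2-7B-Instruct',
    'model_server': 'http://localhost:8000/v1',  # any OpenAI-compatible endpoint
    'api_key': 'EMPTY',
}

# function_list tells the agent which tools to expose to the model.
bot = Assistant(llm=llm_cfg, function_list=['code_interpreter'])

messages = [{'role': 'user', 'content': 'Use the code interpreter to compute 2**20.'}]
for responses in bot.run(messages=messages):
    pass  # bot.run streams intermediate message lists; keep the last one
print(responses)
```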
-
CUDA_VISIBLE_DEVICES=1 python -m vllm.entrypoints.openai.api_server --model ./Qwen2.5-7B-Instruct-AWQ --port 8009 --enable-auto-tool-choice --tool-call-parser hermes --max-seq-len-to-capture 2048 --gpu-memory-utilization 0.7
This is how I start the server. I added these flags after reading vllm/entrypoints/openai/api_server.py in vLLM, but it still throws an error.
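For context, a request that exercises tool calling against that server would look roughly like this (a sketch using the openai Python client; the get_current_weather tool is a made-up example, and whether tool_choice="required" is honored depends on the vLLM version):

```python
# Sketch of a tool-calling request against the vLLM OpenAI-compatible server
# started above (port 8009); the weather tool is a made-up example.
from openai import OpenAI

client = OpenAI(base_url="http://localhost:8009/v1", api_key="EMPTY")

tools = [{
    "type": "function",
    "function": {
        "name": "get_current_weather",
        "description": "Get the current weather for a city",
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    },
}]

resp = client.chat.completions.create(
    model="./Qwen2.5-7B-Instruct-AWQ",   # model name as served by the command above
    messages=[{"role": "user", "content": "What's the weather in Beijing?"}],
    tools=tools,
    tool_choice="required",              # ask the server to force a tool call
)
print(resp.choices[0].message.tool_calls)
```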
-
vLLM does not support Qwen2 function calling. Please open a separate issue for Qwen2.5.
-
Has this been raised before?
Description
Does qwen2-7B support tool_choice set to "required", meaning the model must call one of the tools in tools?
"tools": [
    {
        "function": {}
    }
]
Is this kind of request supported? Is this something the model decides on its own, or do I need to modify the code myself?
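In other words, the question is whether a request of roughly this shape is honored (a sketch; the search tool, model name, and endpoint are placeholders):

```python
# Sketch of the request shape in question; the search tool, model name, and
# endpoint are placeholders. tool_choice="required" means the model must call
# some tool; the commented-out alternative instead forces one specific function
# by name, rather than leaving the decision to the model.
import requests

payload = {
    "model": "qwen2-7b-instruct",
    "messages": [{"role": "user", "content": "Find the latest vLLM release notes."}],
    "tools": [{
        "type": "function",
        "function": {
            "name": "search",
            "description": "Search the web for a query",
            "parameters": {
                "type": "object",
                "properties": {"query": {"type": "string"}},
                "required": ["query"],
            },
        },
    }],
    "tool_choice": "required",
    # "tool_choice": {"type": "function", "function": {"name": "search"}},
}

r = requests.post("http://localhost:8000/v1/chat/completions", json=payload)
print(r.json())
```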