[Feature Request]: Can tool calling support OpenAI's API function call? #239
Comments
Yes, the tutorial shows how to do function calls through vLLM's API: https://modelbest.feishu.cn/wiki/ARJtwko3gisbw5kdPiDcDIOvnGg?from=from_copylink

But doesn't that example use the transformers-style function call rather than the OpenAI interface? With the OpenAI interface there is no need to load a tokenizer separately.

Hi, the tokenizer is only doing data preprocessing. If you don't want to load a tokenizer, just prepare the data the same way the tokenizer would and then call the API.
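For reference, a minimal sketch of that split: the tokenizer is used only to render the prompt (injecting the tool definitions via the chat template), and generation then goes through vLLM's OpenAI-compatible `/v1/completions` endpoint. The model id, server address, and tool schema below are assumptions, and this relies on the MiniCPM3 chat template accepting a `tools` argument, as the tutorial's transformers-based example suggests.

```python
# Hypothetical sketch: tokenizer only for prompt rendering,
# generation via vLLM's OpenAI-compatible completions endpoint.
from openai import OpenAI
from transformers import AutoTokenizer

MODEL = "openbmb/MiniCPM3-4B"  # assumed model id
tokenizer = AutoTokenizer.from_pretrained(MODEL, trust_remote_code=True)

tools = [{
    "type": "function",
    "function": {
        "name": "get_weather",  # hypothetical tool
        "description": "Query the current weather for a city",
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    },
}]
messages = [{"role": "user", "content": "What is the weather in Beijing?"}]

# The chat template inserts the tool definitions into the prompt text;
# this is the "data processing" step the tokenizer is responsible for.
prompt = tokenizer.apply_chat_template(
    messages, tools=tools, add_generation_prompt=True, tokenize=False
)

client = OpenAI(base_url="http://localhost:8000/v1", api_key="EMPTY")
resp = client.completions.create(model=MODEL, prompt=prompt, max_tokens=512)
print(resp.choices[0].text)  # raw tool-call output, still to be parsed
```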
It looks like Qwen2.5 already supports this when served with vLLM: https://qwenlm.github.io/zh/blog/qwen2.5/

You can use https://github.com/OpenBMB/MiniCPM/blob/main/demo/minicpm3/function_call/openai_api_server.py to start a vLLM server that supports function call.

I have it running, but the usage example there still requires additionally loading a transformers tokenizer in order to call the function.
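If `openai_api_server.py` does expose a standard OpenAI-compatible `/v1/chat/completions` endpoint with tools support, the client side would need nothing beyond the `openai` package. The following is only a sketch under that assumption; the base URL, served model name, and tool schema are placeholders.

```python
# Hypothetical client against a function-call-capable server started with
# demo/minicpm3/function_call/openai_api_server.py. Assumes the server speaks
# the standard OpenAI chat-completions protocol (tools / tool_calls).
from openai import OpenAI

client = OpenAI(base_url="http://localhost:8000/v1", api_key="EMPTY")

tools = [{
    "type": "function",
    "function": {
        "name": "get_weather",  # hypothetical tool
        "description": "Query the current weather for a city",
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    },
}]

resp = client.chat.completions.create(
    model="MiniCPM3-4B",  # assumed served model name
    messages=[{"role": "user", "content": "What is the weather in Beijing?"}],
    tools=tools,
    tool_choice="auto",
)

# If the server parses the model output into structured tool calls, they show
# up here; no transformers tokenizer is needed on the client side.
print(resp.choices[0].message.tool_calls)
```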
Feature request / Feature suggestion

Right now it looks like only the transformers-style function call is supported.