[Bug] ChatGLM API calls are not yet supported #1125
Comments
👀 @tyrantGXY Thank you for raising an issue. We will investigate the matter and get back to you as soon as possible.
💻 System environment
Windows

📦 Deployment environment
Official Preview

🌐 Browser
Chrome

🐛 Problem description
I use OneAPI with a channel configured with the ChatGLM API and model, but it cannot be used in LobeChat. When the model is added as a custom model, an "illegal parameter" error is returned.

🚦 Expected results
Add support for ChatGLM.

📷 Steps to reproduce
No response

📝 Supplementary information
No response
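For context, a minimal sketch of the kind of request involved, assuming a OneAPI proxy exposing its usual OpenAI-compatible `/v1/chat/completions` endpoint; the base URL, API key, and model name below are hypothetical placeholders, not values taken from this issue.

```ts
// Sketch: call a ChatGLM model through a OneAPI proxy's
// OpenAI-compatible endpoint (Node 18+, built-in fetch).
const ONEAPI_BASE_URL = "http://localhost:3000/v1"; // hypothetical OneAPI address
const ONEAPI_KEY = "sk-..."; // key issued by OneAPI (placeholder)

async function chat(prompt: string): Promise<string> {
  const res = await fetch(`${ONEAPI_BASE_URL}/chat/completions`, {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${ONEAPI_KEY}`,
    },
    body: JSON.stringify({
      model: "chatglm_turbo", // hypothetical model name configured in the OneAPI channel
      messages: [{ role: "user", content: prompt }],
    }),
  });
  if (!res.ok) {
    // The reporter sees an "illegal parameter" style error at this stage
    // when the same model is used as a custom model in LobeChat.
    throw new Error(`Request failed: ${res.status} ${await res.text()}`);
  }
  const data = await res.json();
  return data.choices[0].message.content;
}
```

Calling the proxy directly like this succeeds for the reporter; the failure occurs only when the model is routed through LobeChat's custom-model setting.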
Currently, the ChatGLM API is not supported; it will be added in #737.
✅ @tyrantGXY This issue is closed. If you have any questions, you can comment and reply.
🎉 This issue has been resolved in version 0.123.0 🎉
The release is available on:
Your semantic-release bot 📦🚀