Bug Report: Issue with Dify AI Version 0.14.1 - OpenAI Bad Request Error (400) #11981
Comments
Oops, it looks like streaming mode isn’t supported for o1 yet. We’ll need to update the model configuration to make it work.
@laipz8200 Thanks a lot. Can you suggest how I can do this? I'm a bit confused by the bot's response too.
Same issue, any solution? Or just wait until the next update?
Please DM me if the developers need tier 5 API keys to test o1. I'm here to support in any way.
Same issue, the o1 model is not supported.
In PR #12037, the logic for identifying the o1 model was changed from `"o1" in model` to `model.startswith("o1")`, which causes incorrect model identification and leads to calling errors. Our model name is `"gpt-o1"`, which doesn't start with `"o1"`, so we need to carefully consider whether to approve this PR.
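To illustrate the concern (a standalone sketch, not Dify's actual code): the old substring check matches `"o1"` anywhere in the model name, while the prefix check from the PR only matches names that begin with `"o1"`, so a deployment-style name like `"gpt-o1"` falls through.

```python
def is_o1_substring(model: str) -> bool:
    # Pre-#12037 style check: matches "o1" anywhere in the model name.
    return "o1" in model

def is_o1_prefix(model: str) -> bool:
    # Post-#12037 style check: only matches names starting with "o1".
    return model.startswith("o1")

for name in ("o1", "o1-preview", "gpt-o1"):
    print(f"{name}: substring={is_o1_substring(name)}, prefix={is_o1_prefix(name)}")
```

For `gpt-o1` the substring check returns `True` while the prefix check returns `False`, which is exactly the mismatch described above.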
Self Checks
Dify version
0.14.1
Cloud or Self Hosted
Self Hosted (Docker)
Steps to reproduce

1. Create a chatbot.
2. Test the chatbot in "Run App" mode: the chatbot functions without any issues in this mode, suggesting the issue may be specific to preview/test workflows in the case of a chatbot.
3. Create a workflow.
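For API-based testing of the app, the request body can be sketched like this (a minimal sketch: the endpoint and field names follow Dify's chat-messages API, while the key and user id are placeholders):

```python
# Hypothetical payload builder for Dify's POST /v1/chat-messages endpoint.
# The reporter tried both "streaming" and "blocking" response modes.
def build_chat_payload(query: str, response_mode: str) -> dict:
    """Build the JSON body for a Dify chat-messages request."""
    if response_mode not in ("streaming", "blocking"):
        raise ValueError("response_mode must be 'streaming' or 'blocking'")
    return {
        "inputs": {},
        "query": query,
        "response_mode": response_mode,
        "user": "tester",  # placeholder user id
    }
```

POSTing this body with an app API key to the self-hosted instance reproduces the call; per the report below, an o1-backed app failed in both modes.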
✔️ Expected Behavior
❌ Actual Behavior
Description
While using the latest release of Dify AI (Version 0.14.1) with OpenAI o1 models, the following error is consistently encountered; meanwhile, I observed that o1-preview works fine. This issue occurs under specific circumstances (described below) despite the correct setup. The error suggests an incompatibility with the `stream=true` configuration for the model.

Steps to Reproduce
1. Create a chatbot.
2. Test the chatbot in "Run App" mode.
3. Create a workflow: a `400 Bad Request` error is encountered here as well.

Expected Behavior

Actual Behavior
Environment

- `stream` enabled. I also tried `blocking` mode with the API, but it still did not work.

Additional Details

- The error points to the `stream=true` configuration, which the o1 model does not support.
- Handling of the `stream` parameter might be inconsistent across these modes.

Suggested Fix
Update the Dify configuration validation to handle models that do not support streaming (`stream=true`) by falling back to `stream=false` for unsupported models.

Ensure consistency in behavior between "Run App" mode and the workflow preview/test mode.

Update the documentation to clearly list models and their compatibility with the `stream` parameter.
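The suggested fallback could be sketched as follows (hypothetical helper and model list, not Dify's actual code): strip the streaming flag before the request is sent for models known not to support it.

```python
# Hypothetical set of model names without streaming support.
NON_STREAMING_MODELS = {"o1", "o1-2024-12-17"}

def normalize_params(model: str, params: dict) -> dict:
    """Return a copy of params with stream forced off for unsupported models."""
    params = dict(params)  # copy so the caller's dict is untouched
    if model in NON_STREAMING_MODELS and params.get("stream"):
        params["stream"] = False
    return params

print(normalize_params("o1", {"stream": True}))      # stream forced off
print(normalize_params("gpt-4o", {"stream": True}))  # left unchanged
```

Copying the dict keeps the normalization side-effect free, so the same input parameters can be reused for models that do support streaming.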