
Bug Report: Issue with Dify AI Version 0.14.1 - OpenAI Bad Request Error (400) #11981

Closed
5 tasks done
officialsuyogdixit opened this issue Dec 23, 2024 · 7 comments · Fixed by #12839
Labels
🐞 bug Something isn't working good first issue Good first issue for newcomers

Comments

@officialsuyogdixit

Self Checks

  • This is only for bug reports; if you would like to ask a question, please head to Discussions.
  • I have searched for existing issues, including closed ones.
  • I confirm that I am using English to submit this report (I have read and agree to the Language Policy).
  • [FOR CHINESE USERS] Please be sure to submit issues in English, otherwise they will be closed. Thank you! :)
  • Please do not modify this template :) and fill in all the required fields.

Dify version

0.14.1

Cloud or Self Hosted

Self Hosted (Docker)

Steps to reproduce

Create a chatbot:

  1. Configure it to use the latest O1 model.
  2. Try using the chatbot in the Preview Panel.
  3. The error is triggered immediately in this mode.

Test the chatbot in "Run App" mode:
The chatbot functions without any issues in this mode, suggesting the issue may be specific to preview/test workflows in the case of chatbots.

Create a workflow:

  1. Configure a workflow using the same O1 model.
  2. Test the workflow in the workflow editor or through the API.
  3. The same 400 Bad Request error is encountered here as well.
[Screenshot attached: 2024-12-23, 8:12 AM]

✔️ Expected Behavior

  • The chatbot should function seamlessly in both the Preview Panel and Run App modes.
  • The workflow should execute without error, including when tested via the API.

❌ Actual Behavior

Description

While using the latest release of Dify AI (Version 0.14.1) with OpenAI o1 models, the following error is consistently encountered. Meanwhile, I observed that o1-preview works fine:

[openai] Bad Request Error, Error code: 400 - {'error': {'message': "Unsupported value: 'stream' does not support true with this model. Supported values are: false.", 'type': 'invalid_request_error', 'param': 'stream', 'code': 'unsupported_value'}}

This issue occurs under the specific circumstances described in the steps above, despite an otherwise correct setup. The error suggests an incompatibility with the stream=true configuration for this model.
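
For reference, a minimal sketch (outside Dify) of the kind of request that triggers this error, assuming the openai>=1.0 Python SDK and an o1 deployment; the model name and prompt are placeholders:

```python
# Minimal reproduction sketch, not Dify code.
from openai import OpenAI, BadRequestError

client = OpenAI()  # reads OPENAI_API_KEY from the environment

try:
    client.chat.completions.create(
        model="o1",  # placeholder; use the o1 model you have access to
        messages=[{"role": "user", "content": "Hello"}],
        stream=True,  # o1 rejects this; the same call with stream=False succeeds
    )
except BadRequestError as exc:
    # Expected: 400 with code 'unsupported_value' on the 'stream' parameter
    print(exc)
```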


Actual Behavior

  • The chatbot fails in the Preview Panel but works in the Run App mode.
  • The workflow fails consistently, both in the editor and through API testing, throwing the following error:
    [openai] Bad Request Error, Error code: 400 - {'error': {'message': "Unsupported value: 'stream' does not support true with this model. Supported values are: false.", 'type': 'invalid_request_error', 'param': 'stream', 'code': 'unsupported_value'}}
    

Environment

  • Dify Version: 0.14.1
  • OpenAI Model: O1 (latest)
  • Configuration: Default configuration with streaming enabled. I also tried blocking mode via the API, but it still did not work (a request sketch follows).
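
A rough sketch of what such a blocking-mode request against the Dify app API looks like; the base URL and app key below are placeholders for a self-hosted instance, so check your version's API reference for the exact fields:

```python
import requests

resp = requests.post(
    "http://localhost/v1/chat-messages",          # placeholder base URL
    headers={
        "Authorization": "Bearer app-xxxxxxxx",   # placeholder app API key
        "Content-Type": "application/json",
    },
    json={
        "inputs": {},
        "query": "Hello",
        "response_mode": "blocking",  # "streaming" is the mode that hits the 400
        "user": "test-user",
    },
    timeout=120,
)
print(resp.status_code, resp.text)
```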

Additional Details

  • The error appears to be related to the stream=true configuration, which the O1 model does not support.
  • The discrepancy between Preview Panel/Workflow testing and Run App functionality indicates that the handling of the stream parameter might be inconsistent across these modes.

Suggested Fix

  1. Update the Dify configuration validation to handle models that do not support streaming (stream=true) by:

    • Automatically setting stream=false for unsupported models (a rough sketch follows this list).
    • Providing clear error messages and configuration guidance in the UI.
  2. Ensure consistency in behavior between:

    • Preview Panel and Run App modes.
    • Workflow Editor and Workflow API Testing.
  3. Update the documentation to clearly list models and their compatibility with the stream parameter.
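
A rough sketch of what the first suggestion could look like in the provider layer; the helper name and model list are illustrative assumptions, not Dify's actual code:

```python
# Hypothetical provider-side guard (illustrative names only).
NON_STREAMING_MODELS = {"o1", "o1-2024-12-17"}  # assumed list; extend as needed

def normalize_stream_param(model: str, params: dict) -> dict:
    """Fall back to blocking mode for models that reject stream=true."""
    if params.get("stream") and model in NON_STREAMING_MODELS:
        return {**params, "stream": False}
    return params
```

Applying such a guard right before the SDK request would also keep the Preview Panel, Run App, and workflow paths consistent.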


@dosubot dosubot bot added the 🐞 bug Something isn't working label Dec 23, 2024
@crazywoola crazywoola added the good first issue Good first issue for newcomers label Dec 23, 2024
@laipz8200 laipz8200 removed their assignment Dec 23, 2024
@laipz8200
Member

Oops, it looks like streaming mode isn’t supported for o1 yet. We’ll need to update the model configuration to make it work.

@officialsuyogdixit
Author

@laipz8200 Thanks a lot. Can you suggest how I can do this? I am a bit confused by the bot's response too.

@DemonDamon

Same issue here, any solution? Or just wait until the next update?

@langgenius langgenius deleted a comment from dosubot bot Jan 1, 2025
@officialsuyogdixit
Author

Please DM me if the developers need Tier 5 API keys to test o1. I am here to support in any way I can.

@OwenTest

OwenTest commented Jan 6, 2025

The same issue: the o1 model is not supported.

@Kevin9703
Contributor

Kevin9703 commented Jan 13, 2025

In PR #12037, the logic for identifying the o1 model was changed from checking "o1" in model to model.startswith("o1"), which causes incorrect model identification and leads to calling errors. Our model name is "gpt-o1", which does not start with "o1", so we need to carefully consider whether to approve this PR (a short illustration follows).

#10593
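
For illustration, the two checks disagree on a name like "gpt-o1" (the model names below are just examples from this thread):

```python
# Old check ("o1" in model) vs the PR #12037 check (model.startswith("o1"))
for name in ("o1", "o1-mini", "gpt-o1", "gpt-4o"):
    print(f"{name}: substring={'o1' in name}, prefix={name.startswith('o1')}")
# "gpt-o1" matches the substring check but not the prefix check,
# which is the mismatch described above.
```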

@wangshiyang

The same issue: the o1 model is not supported.
