
feat: support json_schema for ollama models #11449

Merged · 1 commit merged into langgenius:main · Dec 8, 2024
Conversation

hjlarry (Contributor) commented Dec 7, 2024

Summary


Official doc: https://ollama.com/blog/structured-outputs
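
For context, a minimal sketch of what this maps to on the Ollama API side, based on the structured-outputs doc linked above: the JSON schema is passed in the request's `format` field (previously only the literal string "json" was accepted). The model name and schema below are placeholders for illustration, not taken from this PR.

```python
import json
import requests

# Placeholder schema: constrain the model to return a country object.
schema = {
    "type": "object",
    "properties": {
        "name": {"type": "string"},
        "capital": {"type": "string"},
        "languages": {"type": "array", "items": {"type": "string"}},
    },
    "required": ["name", "capital", "languages"],
}

resp = requests.post(
    "http://localhost:11434/api/chat",
    json={
        "model": "llama3.1",  # assumed to be pulled locally; any model supporting structured outputs
        "messages": [{"role": "user", "content": "Tell me about Canada."}],
        "format": schema,     # the JSON schema itself, not just "json"
        "stream": False,
    },
    timeout=60,
)
# The non-streaming chat response carries the generated text in message.content.
print(json.loads(resp.json()["message"]["content"]))
```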

Screenshots

(screenshot omitted)

Checklist

Important

Please review the checklist below before submitting your pull request.

  • This change requires a documentation update, included: Dify Document
  • I understand that this PR may be closed in case there was no previous discussion or issues. (This doesn't apply to typos!)
  • I've added a test for each change that was introduced, and I tried as much as possible to make a single atomic change.
  • I've updated the documentation accordingly.
  • I ran `dev/reformat` (backend) and `cd web && npx lint-staged` (frontend) to appease the lint gods

@dosubot added the size:S (changes 10-29 lines, ignoring generated files) and ⚙️ feat:model-runtime labels on Dec 7, 2024
@dosubot added the lgtm (approved by a maintainer) label on Dec 8, 2024
@crazywoola merged commit 7e1184c into langgenius:main on Dec 8, 2024 — 5 checks passed
bowenliang123 (Contributor) commented:

Good job, my thumbs up for it! Could you add this feature for OpenAI-compatible providers as well?

hjlarry (Contributor, Author) commented Dec 22, 2024

> Good job, my thumbs up for it! Could you add this feature for OpenAI-compatible providers as well?

Sure! Is there any model to test this feature with?

bowenliang123 (Contributor) commented:

Two approaches in mind:

  • use Ollama's endpoint in the OpenAI-compatible provider with Qwen 2.5 models (see the sketch after this list)
  • use SiliconFlow's models in the OpenAI-compatible provider

hjlarry (Contributor, Author) commented Dec 23, 2024

I checked SiliconFlow's doc at https://docs.siliconflow.cn/guides/json-mode; it seems to only support json_object, not json_schema?
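
For anyone following along, the distinction being discussed, illustrated with OpenAI-style response_format payloads (field names follow the OpenAI API, not SiliconFlow's docs): json_object only guarantees syntactically valid JSON, while json_schema additionally constrains the output to a caller-supplied schema.

```python
# JSON mode: output is valid JSON, but its shape is not enforced.
json_object_format = {"type": "json_object"}

# Structured outputs: output must also conform to this schema (placeholder example).
json_schema_format = {
    "type": "json_schema",
    "json_schema": {
        "name": "answer",
        "schema": {
            "type": "object",
            "properties": {"answer": {"type": "string"}},
            "required": ["answer"],
        },
    },
}
```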
