Self Checks

1. Is this request related to a challenge you're experiencing? Tell me about your story.

Groq has added Llama 3.3 and Llama 3.3 Spec Decode to its list of models. Please add these models to Dify.

2. Additional context or comments

https://console.groq.com/docs/models

3. Can you help us with this feature?
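For reference, the new model IDs can be checked against Groq's OpenAI-compatible models endpoint before wiring them into a provider. A minimal sketch, assuming the `llama-3.3-70b-versatile` / `llama-3.3-70b-specdec` IDs from the Groq docs linked above, the `https://api.groq.com/openai/v1/models` endpoint, and a `GROQ_API_KEY` environment variable:

```python
# Sketch: list the models Groq currently serves and confirm the new Llama 3.3 entries.
import os
import requests

resp = requests.get(
    "https://api.groq.com/openai/v1/models",  # assumed OpenAI-compatible listing endpoint
    headers={"Authorization": f"Bearer {os.environ['GROQ_API_KEY']}"},
    timeout=30,
)
resp.raise_for_status()
available = {m["id"] for m in resp.json()["data"]}

# Model IDs assumed from Groq's model docs; adjust if the console lists different names.
for model_id in ("llama-3.3-70b-versatile", "llama-3.3-70b-specdec"):
    print(model_id, "available" if model_id in available else "missing")
```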
Added new models and Removed the deleted ones for Groq #11455 (#11456)