
fix: add the missing abab6.5t-chat model of Minimax #11484

Merged
merged 1 commit into from
Dec 9, 2024

Conversation

acelyc111
Contributor

@acelyc111 acelyc111 commented Dec 9, 2024

Summary

There is an LLM model named abab6.5t-chat in api/core/model_runtime/model_providers/minimax/llm, but it is missing from api/core/model_runtime/model_providers/minimax/llm/llm.py, so selecting the abab6.5t-chat model and running the application raises a KeyError.
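The failure mode described above can be sketched as follows. This is a hypothetical, simplified illustration of a model-name-to-handler registry like the one in llm.py; the class name MinimaxChatCompletionPro and the dict layout are assumptions, not the file's actual contents.

```python
class MinimaxChatCompletionPro:
    """Placeholder standing in for the Minimax chat completion client class."""
    pass


# Illustrative registry: before this PR, "abab6.5t-chat" had no entry,
# so a plain dict lookup on the selected model name raised KeyError.
MODEL_APIS = {
    "abab6.5-chat": MinimaxChatCompletionPro,
    "abab6.5s-chat": MinimaxChatCompletionPro,
    # the fix: register the previously missing model
    "abab6.5t-chat": MinimaxChatCompletionPro,
}


def get_model_api(model_name: str):
    # dict indexing (not .get) raises KeyError for unregistered models,
    # which matches the error reported in this PR
    return MODEL_APIS[model_name]
```

With the entry present, get_model_api("abab6.5t-chat") resolves instead of raising.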

Screenshots

Checklist

Important

Please review the checklist below before submitting your pull request.

  • This change requires a documentation update, included: Dify Document
  • I understand that this PR may be closed in case there was no previous discussion or issues. (This doesn't apply to typos!)
  • I've added a test for each change that was introduced, and I tried as much as possible to make a single atomic change.
  • I've updated the documentation accordingly.
  • I ran dev/reformat (backend) and cd web && npx lint-staged (frontend) to appease the lint gods

@dosubot dosubot bot added the size:XS This PR changes 0-9 lines, ignoring generated files. label Dec 9, 2024
@dosubot dosubot bot added the lgtm This PR has been approved by a maintainer label Dec 9, 2024
@crazywoola
Member

crazywoola commented Dec 9, 2024

cc @laipz8200 @Yeuoly

@crazywoola crazywoola merged commit 32f8439 into langgenius:main Dec 9, 2024
5 checks passed
Labels
⚙️ feat:model-runtime lgtm This PR has been approved by a maintainer size:XS This PR changes 0-9 lines, ignoring generated files.
2 participants