Mistral Support #173
Conversation
Otherwise, looks like a good start.
I've got this merged with main; just let me merge it with your latest pushes.
You should definitely check the changes I made, in particular the commenting out of … But I figured it would probably be faster to set up the model settings stuff and deal with the refactor to the new unified … later.
The style guide flagged several spelling errors that seemed like false positives. We skipped posting inline suggestions for the following words:
- Groq
Thanks so much!
This pull request introduces integration with the Mistral model for Pydantic AI.
The MistralModel class uses the Mistral Python client to interact with the Mistral API, enabling both streaming and non-streaming requests as well as structured responses.
The integration supports several modes of operation (function calling, JSON mode, and stream mode), selected based on the presence of function and result tools.
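A minimal usage sketch is shown below. The constructor arguments, the model name string, and the `result.data` attribute are assumptions based on how the other model integrations in Pydantic AI are used; the final merged API may differ.

```python
# Sketch: plain (non-structured) completion through the new MistralModel.
# Assumes MISTRAL_API_KEY is set in the environment and that the agent API
# mirrors the existing model integrations.
from pydantic_ai import Agent
from pydantic_ai.models.mistral import MistralModel

model = MistralModel('mistral-large-latest')  # API key read from MISTRAL_API_KEY
agent = Agent(model, system_prompt='Be concise.')

result = agent.run_sync('What is the capital of France?')
print(result.data)
```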
Note: Mistral does not support streaming for function calling or structured responses.
Even when using stream_async, the behavior is not truly streaming. After discussions on the Mistral Discord, I use json_mode to stream only the structured output.
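A sketch of the streamed structured-output path described above, as seen from the agent side. `CityInfo` is a hypothetical result type, and the `run_stream` / `stream()` names are assumed from the existing Pydantic AI streaming interface rather than taken from this PR.

```python
# Sketch: streaming a structured result. Because Mistral does not stream
# function calls, the structured output is produced via JSON mode and
# validated incrementally as chunks arrive.
import asyncio
from pydantic import BaseModel
from pydantic_ai import Agent
from pydantic_ai.models.mistral import MistralModel

class CityInfo(BaseModel):  # hypothetical result type for illustration
    city: str
    country: str

agent = Agent(MistralModel('mistral-large-latest'), result_type=CityInfo)

async def main() -> None:
    async with agent.run_stream('Tell me about Paris.') as result:
        async for partial in result.stream():
            print(partial)  # partially validated CityInfo data per chunk

asyncio.run(main())
```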
Sample Notebook
Test: >> Working on it. <<