Feat: Handle Ollama API server #52
Merged
First, thank you for this project — most of the implementation is already done: that's awesome!
Subject
Here is a proposal for handling the Ollama API using the OpenAI SDK, as specified in the docs.
Implementation
The `OpenAIA2AServer` and `OpenAIA2AClient` classes are already implemented on top of the OpenAI SDK, so the proposed files are `OllamaA2AServer` and `OllamaA2AClient` respectively. This PR does not support an Ollama API key, because the presence of an API key is how `OpenAIA2AServer` and `OpenAIA2AClient` detect whether they need to set up the OpenAI client. This proposal has pros and cons; feel free to request modifications if you disagree with it.

Note: my linter made a lot of noise in the `client/llm/openai.py` and `client/streaming.py` files. Sorry for that, but I was not able to run your `make format && make lint` with the new poetry `run` command.
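To illustrate the detection idea described above, here is a minimal sketch (all names are hypothetical, not the actual PR code): when no API key is supplied, the client config is pointed at Ollama's OpenAI-compatible endpoint, which ignores the key but still requires the SDK to pass a placeholder.

```python
# Hypothetical sketch of the "no API key means Ollama" heuristic the PR
# description mentions. Function and constant names are illustrative only.
from typing import Optional

# Ollama exposes an OpenAI-compatible API under /v1 by default.
OLLAMA_BASE_URL = "http://localhost:11434/v1"

def resolve_client_config(api_key: Optional[str]) -> dict:
    """Return kwargs for constructing an OpenAI SDK client.

    No API key -> assume a local Ollama server. Ollama ignores the key,
    but the OpenAI SDK requires one, so a placeholder is used.
    """
    if api_key is None:
        return {"base_url": OLLAMA_BASE_URL, "api_key": "ollama"}
    return {"api_key": api_key}

# Usage (requires the `openai` package and a running Ollama server):
# from openai import OpenAI
# client = OpenAI(**resolve_client_config(None))
# client.chat.completions.create(model="llama3", messages=[...])
```

The design trade-off noted in the PR follows from this: since the key's absence doubles as the routing signal, an Ollama deployment that itself sits behind an API key cannot be expressed.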
Thanks again!