Ollama support #162
The style guide flagged several spelling errors that seemed like false positives. We skipped posting inline suggestions for the following words:
- Ollama
Amazing! I'm just getting on a flight back to the UK; if the internet is usable, I'll review soon.
Otherwise this looks great! 🎉
Really looking forward to releasing this.
Thanks for your review @samuelcolvin. I have addressed the majority of the issues; I just had a couple of questions (left on the unresolved comments). EDIT: I'm investigating the failing CI/lint check; it is working locally.
you need to merge with/rebase against
The style guide flagged several spelling errors that seemed like false positives. We skipped posting inline suggestions for the following words:
- Pydantic
- Logfire
- FastUI
- [Cc]ollab
@samuelcolvin thanks for the advice. After getting into a bit of a pickle, I think I have successfully rebased now, and everything seems to be running. I've made all of the additional changes you requested, and the following command still works when testing the Ollama model. Let me know what else, if anything, is required before merging.
This is awesome! I'll create a new PR to add tests, since they require special permissions to run. Thank you so much.
fix #112.

A PR to implement an `OllamaModel` and allow for easy access to Ollama. A few questions / notes:

- `ollama.list()`: I have put a (commented-out) example of how this would work in the `models/ollama.py` file, but for now I have just left `str` as the type.
- I have used the existing `OpenAIAgentModel` rather than reimplementing it, but to get that to work with the Ollama model, I have had to allow a much more general `str` type alongside the much more tightly defined `ChatModel` type. If you think it makes more sense to fully duplicate the `OpenAIAgentModel` as an `OllamaAgentModel`, which would allow stricter typing for both versions, just let me know.

Otherwise, this works well.
To test it: follow `docs/install.md#ollama`, install the `qwen2.5-coder:latest` model, and then run: