
Ollama support #162

Merged · 26 commits into pydantic:main · Dec 8, 2024
Conversation

@cal859 (Contributor) commented Dec 6, 2024

Fixes #112.

A PR to implement an OllamaModel and allow for easy access to Ollama.

A few questions / notes:

  • The main issue with Ollama is the typing of the model names. Ollama has hundreds of models, with more added daily or weekly, so maintaining a complete list (as we do for Gemini) would be quite difficult. It is possible to fetch the models available to the current user using ollama.list(), and I have included a commented-out example of how this would work in models/ollama.py, but for now I have left the type as a plain str.
  • A related issue is code duplication: there is so much overlap between this code and the OpenAI code that I didn't want to replicate it, so I am reusing the existing OpenAIAgentModel rather than reimplementing it. To make that work with the Ollama model, however, I had to allow the much more general str type alongside the tightly defined ChatModel type. If you think it makes sense to fully duplicate OpenAIAgentModel as an OllamaAgentModel, so both can have stricter typing, just let me know.
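As a rough illustration of the ollama.list() idea in the first bullet (not the PR's actual code: the helper name here is made up, and the response shape of the ollama client has varied between versions), the dynamic lookup could look something like this, falling back to accepting any string when no server is available:

```python
def available_ollama_models() -> list[str]:
    """Best-effort list of locally pulled Ollama models.

    Requires the `ollama` client package and a running local server;
    if either is missing, returns an empty list so callers can fall
    back to accepting any plain string, as the PR does.
    """
    try:
        import ollama

        # Older clients return {"models": [{"name": ...}, ...]};
        # the field names here are an assumption, not a stable API.
        return [m["name"] for m in ollama.list()["models"]]
    except Exception:
        return []
```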

Otherwise, this works well.

To test it:

  • Follow the setup instructions added to docs/install.md#ollama, install the qwen2.5-coder:latest model, and then run:
PYDANTIC_AI_MODEL=ollama:qwen2.5-coder:latest uv run -m pydantic_ai_examples.pydantic_model 
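One detail worth noting about the command above: the model string contains two colons (ollama:qwen2.5-coder:latest), so the provider prefix has to be split off at the first colon only, leaving the model's own tag intact. A minimal sketch of that parsing (the function name is hypothetical, not the library's actual parser):

```python
def split_model_string(value: str) -> tuple[str, str]:
    """Split 'provider:model' at the first colon only, since Ollama
    model names can themselves contain colons (e.g. a ':latest' tag)."""
    provider, _, model = value.partition(":")
    return provider, model

# split_model_string("ollama:qwen2.5-coder:latest")
# -> ("ollama", "qwen2.5-coder:latest")
```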

@hyperlint-ai (bot) left a comment
The style guide flagged several spelling errors that seemed like false positives. We skipped posting inline suggestions for the following words:

  • Ollama

@cal859 mentioned this pull request Dec 6, 2024
@samuelcolvin (Member) commented:

Amazing! I'm just getting on a flight back to the UK; if the Internet is usable, I'll review soon.

@samuelcolvin (Member) left a review

Otherwise this looks great! 🎉

Really looking forward to releasing this.

Resolved review comments on: pydantic_ai_slim/pydantic_ai/models/__init__.py, pydantic_ai_slim/pydantic_ai/models/ollama.py, pydantic_ai_slim/pydantic_ai/models/openai.py, docs/api/models/ollama.md, docs/install.md
@cal859 (Contributor, Author) commented Dec 8, 2024

Thanks for your review @samuelcolvin .

I have addressed the majority of the issues; I just had a couple of questions, left on the unresolved comments.

EDIT: I'm investigating the failing CI/lint check; it works locally.

@cal859 requested a review from @samuelcolvin December 8, 2024 17:15
@samuelcolvin (Member) commented:

You need to merge with/rebase against main to get the tests passing. Do you want me to help?

Resolved review comments on: docs/api/models/ollama.md, pydantic_ai_slim/pydantic_ai/models/ollama.py
@cal859 requested a review from @samuelcolvin December 8, 2024 18:38
@hyperlint-ai (bot) left a comment

The style guide flagged several spelling errors that seemed like false positives. We skipped posting inline suggestions for the following words:

  • Pydantic
  • Logfire
  • FastUI
  • [Cc]ollab

@cal859 (Contributor, Author) commented Dec 8, 2024

> you need to merge with/rebase against main to get tests passing, do you want me to help?

@samuelcolvin thanks for the advice. After getting into a bit of a pickle, I think I have successfully rebased now, and everything seems to be running. I've made all of the additional changes you requested, and the following command still works when testing the Ollama model. Let me know if anything else is required before merging.

PYDANTIC_AI_MODEL=ollama:llama3.2 uv run -m pydantic_ai_examples.pydantic_model

@samuelcolvin (Member) commented:

This is awesome, I'll create a new PR to add tests since they require special permissions to run.

Thank you so much.

@samuelcolvin merged commit bdf41b6 into pydantic:main Dec 8, 2024
15 checks passed
@samuelcolvin mentioned this pull request Dec 8, 2024
@cal859 deleted the feature/ollama-support branch December 8, 2024 22:05
Successfully merging this pull request may close these issues.

Support ollama
4 participants