Conversation

@maxbrunsfeld
Collaborator

I wanted to be able to work offline, so I made it a little bit more convenient to point zeta2 at ollama.

  • For zeta2, don't require that request ids be UUIDs
  • Add an env var `ZED_ZETA2_OLLAMA` that sets the edit prediction URL and model id to work with ollama (a rough sketch of this override follows below).

Release Notes:

  • N/A
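
For context, here is a minimal sketch of how such an env-var override could be wired up. The `Zeta2Override` struct, the default local Ollama URL, and the exact format of the variable are assumptions for illustration and are not taken from the actual Zed source:

```rust
use std::env;

// Hypothetical container for the values the env var is meant to control:
// the edit prediction endpoint and the model id.
struct Zeta2Override {
    prediction_url: String,
    model_id: String,
}

fn zeta2_ollama_override() -> Option<Zeta2Override> {
    // Assumes `ZED_ZETA2_OLLAMA` carries the Ollama model id and that the URL
    // defaults to a local Ollama server; the real variable's format may differ.
    let model_id = env::var("ZED_ZETA2_OLLAMA").ok()?;
    Some(Zeta2Override {
        prediction_url: "http://localhost:11434".to_string(),
        model_id,
    })
}

fn main() {
    match zeta2_ollama_override() {
        Some(o) => println!("zeta2 edit prediction -> {} ({})", o.prediction_url, o.model_id),
        None => println!("ZED_ZETA2_OLLAMA not set; using the default zeta2 endpoint"),
    }
}
```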

cla-bot added the cla-signed label (the user has signed the Contributor License Agreement) on Nov 10, 2025
@maxbrunsfeld merged commit b8081ad into main on Nov 10, 2025
24 checks passed
@maxbrunsfeld deleted the zed-zeta2-ollama branch on November 10, 2025 at 05:10
11happy pushed a commit to 11happy/zed that referenced this pull request on Dec 1, 2025
@robertherber

Sorry if it's a stupid question, but would this work with completions and predictions? And how would I go about configuring it?
