M1: Add Ollama chat loop with LLM provider abstraction#26

Merged

bug-ops merged 1 commit into main from feat/m1/ollama-chat-loop on Feb 5, 2026

M1: Add Ollama chat loop with LLM provider abstraction#26
bug-ops merged 1 commit intomainfrom
feat/m1/ollama-chat-loop

Conversation


@bug-ops bug-ops commented Feb 5, 2026

Summary

  • LlmProvider trait with native async (no async-trait crate, Edition 2024)
  • OllamaProvider implementation using ollama-rs
  • Config loading from TOML with env var overrides (ZEPH_LLM_PROVIDER, ZEPH_LLM_BASE_URL, ZEPH_LLM_MODEL)
  • Interactive CLI agent loop with multi-turn conversation history
  • Generic Agent&lt;P: LlmProvider&gt; using static dispatch, since native async traits are not dyn-safe
  • 6 unit tests (config: 3, ollama: 3)
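The trait-plus-generic-agent shape described above might look roughly like this. This is a minimal sketch, not the PR's actual code: a mock provider stands in for the ollama-rs-backed OllamaProvider, a toy single-threaded executor stands in for a real async runtime, and every name other than LlmProvider and Agent is an assumption.

```rust
use std::future::Future;
use std::pin::pin;
use std::task::{Context, Poll, RawWaker, RawWakerVTable, Waker};

// Native async fn in a trait (stable since Rust 1.75; no async-trait crate).
trait LlmProvider {
    async fn complete(&self, prompt: &str) -> String;
}

// Stand-in for the real OllamaProvider: just echoes the prompt.
struct MockProvider;

impl LlmProvider for MockProvider {
    async fn complete(&self, prompt: &str) -> String {
        format!("echo: {prompt}")
    }
}

// Native async traits are not dyn-safe, so the agent is generic over P
// (static dispatch) rather than holding a Box<dyn LlmProvider>.
struct Agent<P: LlmProvider> {
    provider: P,
    history: Vec<String>,
}

impl<P: LlmProvider> Agent<P> {
    fn new(provider: P) -> Self {
        Self { provider, history: Vec::new() }
    }

    // One conversation turn: record the user input, ask the provider,
    // record the reply so later turns see the full multi-turn history.
    async fn turn(&mut self, input: &str) -> String {
        self.history.push(format!("user: {input}"));
        let reply = self.provider.complete(input).await;
        self.history.push(format!("assistant: {reply}"));
        reply
    }
}

// Toy executor: enough to drive futures that never actually suspend.
fn noop_waker() -> Waker {
    fn raw() -> RawWaker {
        fn clone(_: *const ()) -> RawWaker { raw() }
        fn noop(_: *const ()) {}
        static VTABLE: RawWakerVTable = RawWakerVTable::new(clone, noop, noop, noop);
        RawWaker::new(std::ptr::null(), &VTABLE)
    }
    unsafe { Waker::from_raw(raw()) }
}

fn block_on<F: Future>(fut: F) -> F::Output {
    let mut fut = pin!(fut);
    let waker = noop_waker();
    let mut cx = Context::from_waker(&waker);
    loop {
        if let Poll::Ready(out) = fut.as_mut().poll(&mut cx) {
            return out;
        }
    }
}

fn main() {
    let mut agent = Agent::new(MockProvider);
    let reply = block_on(agent.turn("hello"));
    println!("{reply}");
    assert_eq!(agent.history.len(), 2);
}
```

The trade-off of static dispatch is that each agent is monomorphized per provider type; that is the usual price for async fn in traits without boxing.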

Acceptance Criteria

  • cargo run starts interactive REPL
  • Config loaded from config/default.toml
  • Env vars override config values
  • Multi-turn conversation within session
  • Ctrl+C / "exit" / "quit" exits cleanly
  • cargo clippy -- -D warnings clean
  • cargo test --workspace — 6 tests pass
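The env-override criterion amounts to "read the variable; if unset, fall back to the TOML value". A rough sketch of that precedence rule using only the standard library; the struct shape, field names, and default values here are illustrative, not the crate's actual config API:

```rust
use std::env;

// Illustrative config shape; the real crate parses this from config/default.toml.
#[derive(Debug, Clone, PartialEq)]
struct LlmConfig {
    provider: String,
    base_url: String,
    model: String,
}

impl LlmConfig {
    // Each ZEPH_LLM_* variable, when set, wins over the file-based value.
    fn apply_env_overrides(self) -> Self {
        Self {
            provider: env::var("ZEPH_LLM_PROVIDER").unwrap_or(self.provider),
            base_url: env::var("ZEPH_LLM_BASE_URL").unwrap_or(self.base_url),
            model: env::var("ZEPH_LLM_MODEL").unwrap_or(self.model),
        }
    }
}

fn main() {
    // Pretend these values were just parsed from config/default.toml.
    let from_file = LlmConfig {
        provider: "ollama".to_string(),
        base_url: "http://localhost:11434".to_string(),
        model: "llama3".to_string(),
    };
    // With none of the ZEPH_LLM_* variables set, the file values survive.
    let effective = from_file.clone().apply_env_overrides();
    assert_eq!(effective, from_file);
    println!("effective config: {effective:?}");
}
```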

Closes #8, closes #9, closes #10, closes #11
Relates to #2

LlmProvider trait with native async, OllamaProvider via ollama-rs,
TOML config with env overrides, interactive CLI REPL with multi-turn
conversation history.

Closes #8, closes #9, closes #10, closes #11
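The clean-exit behavior for the REPL reduces to a small predicate plus an EOF check in the read loop; Ctrl+C handling (typically a signal handler) is omitted here. A sketch under those assumptions, with the helper name made up and a plain echo in place of a real agent turn:

```rust
use std::io::{self, BufRead, Write};

// True when the user asked to leave the session.
fn should_exit(input: &str) -> bool {
    matches!(input.trim().to_lowercase().as_str(), "exit" | "quit")
}

fn main() -> io::Result<()> {
    let stdin = io::stdin();
    let mut line = String::new();
    loop {
        print!("> ");
        io::stdout().flush()?;
        line.clear();
        // Ctrl+D (EOF) yields 0 bytes read; treat it like an exit request.
        if stdin.lock().read_line(&mut line)? == 0 || should_exit(&line) {
            println!("bye");
            return Ok(());
        }
        // A real agent turn would go here; this sketch just echoes.
        println!("you said: {}", line.trim());
    }
}
```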
bug-ops merged commit 260cc20 into main on Feb 5, 2026
bug-ops deleted the feat/m1/ollama-chat-loop branch on February 5, 2026 at 21:31