M1: Ollama chat loop — interactive CLI with LLM #2

@bug-ops

Description

Goal

The user types a message in the CLI and gets a response from Ollama: the first proof of life for the agent.

Scope

  • LlmProvider trait definition
  • Ollama backend (ollama-rs)
  • Config loading (TOML + env overrides)
  • Agent loop with CLI I/O
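
The first two scope items could be sketched roughly as below. This is an illustrative assumption, not the issue's actual design: the method name `chat`, the `Message` struct, and the `MockProvider` stand-in are all hypothetical, and the real backend would implement the trait over ollama-rs (likely async) rather than this synchronous mock.

```rust
/// Abstraction over chat backends so the agent loop doesn't depend on
/// Ollama directly. (Sketch: method and type names are assumptions.)
trait LlmProvider {
    /// Send the conversation so far; return the assistant's reply.
    fn chat(&self, messages: &[Message]) -> Result<String, String>;
}

struct Message {
    role: String, // "user" | "assistant"
    content: String,
}

/// Stand-in backend used here so the sketch compiles without ollama-rs;
/// the real implementation would call the Ollama HTTP API.
struct MockProvider;

impl LlmProvider for MockProvider {
    fn chat(&self, messages: &[Message]) -> Result<String, String> {
        let last = messages.last().ok_or("empty conversation")?;
        Ok(format!("echo: {}", last.content))
    }
}

fn main() {
    let provider = MockProvider;
    let history = vec![Message { role: "user".into(), content: "hello".into() }];
    match provider.chat(&history) {
        Ok(reply) => println!("{reply}"),
        Err(e) => eprintln!("error: {e}"),
    }
}
```

Keeping the loop generic over `LlmProvider` means later milestones can swap in other backends without touching the CLI code.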

Acceptance Criteria

  • cargo run starts interactive session
  • User types message, receives Ollama response
  • Multi-turn conversation within session
  • Ctrl+C exits cleanly
  • Clear error if Ollama unreachable
  • Config from config/default.toml, env var overrides work
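
The criteria above (multi-turn history, clean exit, clear error when Ollama is unreachable) could look roughly like this in the agent loop. The `respond` function is a hypothetical stand-in for the provider call so the sketch runs without a network; the error message, prompt string, and history shape are assumptions.

```rust
use std::io::{self, BufRead, Write};

/// Stand-in for the Ollama call; the real loop would send `history` to the
/// LLM backend and propagate its errors.
fn respond(history: &[(String, String)]) -> Result<String, String> {
    let (_, last) = history.last().ok_or("empty history")?;
    Ok(format!("echo: {last}"))
}

fn main() {
    let stdin = io::stdin();
    let mut history: Vec<(String, String)> = Vec::new(); // (role, content)

    loop {
        print!("> ");
        io::stdout().flush().ok();
        let mut line = String::new();
        // EOF ends the session cleanly; Ctrl+C is left to the default handler.
        if stdin.lock().read_line(&mut line).unwrap_or(0) == 0 {
            break;
        }
        let msg = line.trim().to_string();
        if msg.is_empty() {
            continue;
        }
        history.push(("user".into(), msg));
        match respond(&history) {
            Ok(reply) => {
                println!("{reply}");
                // Keeping the reply in history is what makes the session multi-turn.
                history.push(("assistant".into(), reply));
            }
            // A clear, actionable message when the backend can't be reached.
            Err(e) => eprintln!("error: {e} (is Ollama running?)"),
        }
    }
}
```

Exiting on EOF keeps the loop testable with piped input; a Ctrl+C handler (e.g. via a signal-handling crate) could be layered on separately.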

Child Issues

Depends On

#1 (M0)

Labels

M1 (Milestone 1: Ollama Chat), epic (Milestone-level tracking issue)
