hasbegun/ollama_chat

Ollama Chat

Flutter UI for local Ollama API

Usage

Launch the Ollama desktop app or run ollama serve.

The OllamaClient reads the OLLAMA_BASE_URL environment variable, falling back to http://127.0.0.1:11434/api when it is not set.
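The lookup above can be sketched as follows (a Python illustration of the resolution logic; the actual client is written in Dart, and the function name here is ours, not the project's):

```python
import os

# Default matches the local Ollama server's standard port and API prefix.
DEFAULT_BASE_URL = "http://127.0.0.1:11434/api"

def resolve_base_url(env=None):
    """Return OLLAMA_BASE_URL from the environment, or the default."""
    if env is None:
        env = os.environ
    return env.get("OLLAMA_BASE_URL", DEFAULT_BASE_URL)
```

Setting OLLAMA_BASE_URL is only needed when the Ollama server runs on a non-default host or port.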

Platforms

  • macOS
  • Windows
  • Linux
  • Web

Features

  • Generate a chat completion
  • List models
  • Show model information
  • Pull a model
  • Update a model
  • Delete a model
  • Chat history
  • Temperature & model options
  • Create a model (Modelfile)
  • Prompt templates library
  • Ollama settings customization
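The features above correspond to endpoints of the public Ollama REST API (for example POST /api/chat for chat completions, GET /api/tags to list models, POST /api/pull, DELETE /api/delete). As an illustration, the JSON body for a chat completion with a temperature option looks roughly like this (the helper name is ours; the payload shape follows the Ollama API, where model parameters travel in the "options" object):

```python
import json

def chat_request_body(model, messages, temperature=0.8):
    """Build the JSON body for POST /api/chat on an Ollama server."""
    return json.dumps({
        "model": model,
        "messages": messages,       # [{"role": "user", "content": "..."}, ...]
        "stream": False,            # ask for a single response, not a stream
        "options": {"temperature": temperature},
    })

body = chat_request_body("llama3", [{"role": "user", "content": "Hello"}], 0.2)
```

Appending each exchange to the messages list is what gives the chat its history: the server sees the full conversation on every request.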

Screenshots






This project is inspired by Dauillama (https://github.com/rxlabz/dauillama).
