A modern desktop chat interface for Ollama AI models. The application provides a sleek, user-friendly way to converse with locally running Ollama models, similar to ChatGPT but entirely offline.
- 🚀 Cross-platform desktop application (Windows, macOS, Linux)
- 🤖 Support for multiple Ollama models
- 🔄 Automatic model detection and selection
- 💬 Modern chat interface with streaming responses
- 📝 Full Markdown support with syntax highlighting
- 💾 Persistent chat history using localStorage
- 🔄 Conversation context management
- 🎨 Dark mode interface
- ✨ Code block syntax highlighting with line numbers
- 📋 One-click code copying
- 🏷️ Automatic chat title generation
- 📱 Responsive design
Before running this application, make sure you have:
- Node.js (v16 or higher)
- Ollama installed and running locally
- At least one Ollama model pulled (e.g., qwen2.5-coder, llama2, codellama)
- Clone the repository:
git clone https://github.com/yourusername/ollama-ui-chat.git
cd ollama-ui-chat
- Install dependencies:
npm install
- Start the development server:
npm run electron-dev
To create a production build:
npm run electron-pack
The packaged application will be available in the dist directory.
- Start Ollama on your machine
- Launch Ollama UI Chat
- Click "New Chat" to start a conversation
- Select the model you want to use from the available models
- Type your message and press Enter or click the send button
- The AI will respond, maintaining context throughout the conversation
- Automatic detection of installed Ollama models (see the sketch after this list)
- Model selection for each chat session
- Model information display in chat interface
- Support for switching between different models
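A minimal sketch of how model detection can work, assuming Ollama's default address (http://localhost:11434) and its standard /api/tags endpoint; the function and type names are illustrative, not the app's actual service code:

```typescript
// Minimal sketch: query the local Ollama server for installed models.
// Assumes the default Ollama address; the real service layer in
// src/services/ may be organized differently.
interface OllamaModel {
  name: string;
  size: number;
  modified_at: string;
}

export async function listInstalledModels(): Promise<string[]> {
  const res = await fetch("http://localhost:11434/api/tags");
  if (!res.ok) {
    throw new Error(`Ollama returned ${res.status}`);
  }
  const data: { models: OllamaModel[] } = await res.json();
  return data.models.map((m) => m.name);
}
```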
- Create new chats with model selection
- Delete existing chats
- Edit chat titles
- Automatic title generation based on the first message (sketch below)
- Model indicator for each chat
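As one illustration, a chat title can be derived by truncating the first user message. This heuristic is an assumption for the sketch; the app could just as well ask the model for a short summary.

```typescript
// Minimal sketch: derive a chat title from the first user message by
// collapsing whitespace and truncating. The heuristic and name are
// assumptions, not necessarily what the app ships.
export function generateChatTitle(firstMessage: string, maxLength = 40): string {
  const singleLine = firstMessage.replace(/\s+/g, " ").trim();
  return singleLine.length <= maxLength
    ? singleLine
    : `${singleLine.slice(0, maxLength - 1)}…`;
}
```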
- Full Markdown rendering
- Syntax-highlighted code blocks
- One-click code copying
- Message history persistence
- Real-time streaming responses (see the streaming sketch after this list)
- Context-aware conversations
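A minimal sketch of streaming a context-aware reply via Ollama's /api/chat endpoint, which accepts the full message history and streams newline-delimited JSON chunks; the default server address and the function/type names are assumptions for illustration:

```typescript
// Minimal sketch: stream a reply from Ollama's /api/chat endpoint while
// passing the full message history so the model keeps conversational context.
interface ChatMessage {
  role: "user" | "assistant" | "system";
  content: string;
}

export async function streamChat(
  model: string,
  history: ChatMessage[],
  onToken: (token: string) => void
): Promise<string> {
  const res = await fetch("http://localhost:11434/api/chat", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ model, messages: history, stream: true }),
  });
  const reader = res.body!.getReader();
  const decoder = new TextDecoder();
  let buffer = "";
  let reply = "";
  // Ollama streams newline-delimited JSON objects, each carrying a token.
  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    buffer += decoder.decode(value, { stream: true });
    const lines = buffer.split("\n");
    buffer = lines.pop() ?? ""; // keep any incomplete line for the next chunk
    for (const line of lines) {
      if (!line.trim()) continue;
      const chunk = JSON.parse(line);
      if (chunk.message?.content) {
        reply += chunk.message.content;
        onToken(chunk.message.content);
      }
    }
  }
  return reply;
}
```

Appending the assistant's completed reply back onto the history before the next call is what keeps the conversation context-aware.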
- Clean, modern design
- Dark mode
- Responsive layout
- Sidebar navigation with model indicators
- Message streaming
- Model information display
- Local storage for chat history (see the sketch after this list)
- Automatic data persistence
- No external database required
- Fast and reliable data access
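A minimal sketch of the localStorage persistence idea; the storage key and the Chat shape are assumptions for illustration, not the app's actual schema:

```typescript
// Minimal sketch: persist chats as a single JSON blob in localStorage.
// Key name and Chat shape are illustrative assumptions.
interface Chat {
  id: string;
  title: string;
  model: string;
  messages: { role: string; content: string }[];
}

const STORAGE_KEY = "ollama-ui-chat.chats";

export function saveChats(chats: Chat[]): void {
  localStorage.setItem(STORAGE_KEY, JSON.stringify(chats));
}

export function loadChats(): Chat[] {
  const raw = localStorage.getItem(STORAGE_KEY);
  return raw ? (JSON.parse(raw) as Chat[]) : [];
}
```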
- Frontend Framework: React with TypeScript
- Desktop Framework: Electron
- UI Components: Material-UI (MUI)
- Code Highlighting: Prism.js
- Markdown Rendering: React Markdown
- Build Tool: Create React App with react-app-rewired
- Storage: Browser's localStorage API
ollama-ui-chat/
├── public/
│ └── electron.js # Electron main process
├── src/
│ ├── components/ # React components
│ ├── services/ # Service layer
│ ├── types/ # TypeScript types
│ └── App.tsx # Main React component
└── package.json # Project configuration
Contributions are welcome! Please feel free to submit a Pull Request. Check out our Contributing Guidelines for more details.
This project is licensed under the MIT License - see the LICENSE file for details.
- Ollama for providing the AI model server
- Electron for the desktop application framework
- React for the UI framework
- Material-UI for the component library
If you encounter any issues or have questions, please file an issue on the GitHub repository.