StraicoAPIProxy implements the same API endpoints as Ollama, LMStudio/OpenAI, and Anthropic Claude, but redirects the requests to the Straico API server. This allows you to use any application that supports one of these APIs while leveraging Straico's cloud LLM models instead of running a local LLM.
Disclaimer: This is not an official Ollama or Straico product.
Please follow the Setup Guide.
Once the container is running, you can use any Ollama-, LMStudio/OpenAI-, or Anthropic Claude-compatible application by pointing it to the proxy's base URL. By default the proxy listens on port 11434, unless you change it in the docker-compose.yml file. The supported APIs are:
- Ollama
- LMStudio/OpenAI
- Anthropic Claude
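Before wiring up an application, you can confirm the proxy is reachable with a quick request to the Ollama-style model listing endpoint. This is a minimal sketch, assuming the proxy is running locally on the default port:

```python
import requests

# Query the proxy's Ollama-compatible model listing endpoint.
# Assumes the proxy is reachable at localhost on the default port 11434.
resp = requests.get("http://localhost:11434/api/tags", timeout=10)
resp.raise_for_status()

# An Ollama-style /api/tags response contains a "models" array;
# print each model name the proxy exposes.
for model in resp.json().get("models", []):
    print(model.get("name"))
```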
The proxy implements the following main endpoints, grouped by API:
Ollama:
- /api/generate
- /api/chat
- /api/tags
- /api/embeddings
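For example, the chat endpoint can be exercised with a plain HTTP request. This is a sketch rather than the project's documented usage; the model name is a placeholder, so substitute one returned by /api/tags:

```python
import requests

payload = {
    # Placeholder model name; use one listed by the proxy's /api/tags.
    "model": "openai/gpt-4o-mini",
    "messages": [{"role": "user", "content": "Say hello in one sentence."}],
    "stream": False,  # request a single JSON response rather than a stream
}
resp = requests.post("http://localhost:11434/api/chat", json=payload, timeout=60)
resp.raise_for_status()

# Non-streaming Ollama-style responses carry the reply under "message".
print(resp.json()["message"]["content"])
```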
LMStudio/OpenAI:
- /v1/chat/completions (alias: /chat/completions)
- /v1/completions
- /v1/models
- /v1/embeddings
- /v1/audio/speech
- /v1/images/generations
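Because these endpoints follow the OpenAI API shape, the official OpenAI Python client can be pointed at the proxy. A sketch, assuming the proxy does not validate the API key itself (Straico credentials are configured on the proxy side):

```python
from openai import OpenAI

# Point the OpenAI client at the proxy instead of api.openai.com.
# The key value is a placeholder; the client requires a non-empty string.
client = OpenAI(base_url="http://localhost:11434/v1", api_key="not-needed")

resp = client.chat.completions.create(
    model="openai/gpt-4o-mini",  # placeholder; pick one from /v1/models
    messages=[{"role": "user", "content": "Say hello in one sentence."}],
)
print(resp.choices[0].message.content)
```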
Anthropic Claude:
- /v1/messages
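Likewise, the Anthropic-style Messages endpoint accepts the usual Anthropic request shape. Again a sketch: the headers mirror Anthropic's API, and whether the proxy checks the x-api-key value is an assumption here:

```python
import requests

headers = {
    "x-api-key": "not-needed",          # placeholder; Straico auth lives server-side
    "anthropic-version": "2023-06-01",  # standard Anthropic API version header
}
payload = {
    "model": "anthropic/claude-3.5-sonnet",  # placeholder model name
    "max_tokens": 256,
    "messages": [{"role": "user", "content": "Say hello in one sentence."}],
}
resp = requests.post(
    "http://localhost:11434/v1/messages", json=payload, headers=headers, timeout=60
)
resp.raise_for_status()

# Anthropic-style responses return a list of content blocks.
print(resp.json()["content"][0]["text"])
```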
OllamaStraicoAPIProxy has been tested and confirmed to work with the following applications and integrations:
- Home Assistant
  - Integration: Ollama for Home Assistant
  - Description: Use OllamaStraicoAPIProxy with Home Assistant for AI-powered home automation tasks.
- Logseq
  - Plugin: ollama-logseq
  - Description: Integrate OllamaStraicoAPIProxy with Logseq for enhanced note-taking and knowledge management.
- Obsidian
  - Plugin: obsidian-ollama
  - Description: Use OllamaStraicoAPIProxy within Obsidian for AI-assisted note-taking and writing.
- Snippety
  - Website: https://snippety.app/
  - Description: Leverage OllamaStraicoAPIProxy with Snippety for AI-assisted snippet management and generation.
- Rivet
  - Website: https://rivet.ironcladapp.com/
  - Description: Allows using Ollama Chat and OpenAI Chat (via LM Studio).
- Continue.dev
  - Website: https://www.continue.dev/
  - Description: Generate code using Ollama and LM Studio.
- Open WebUI
  - Website: https://docs.openwebui.com/
  - Description: Allows using Ollama with Open WebUI.
  - Sample Configuration: docker-compose.yaml
- Flowise
  - Website: https://flowiseai.com/
  - Description: Allows using Ollama with Flowise.
  - Sample Configuration: docker-compose.yaml
- Aider Chat
  - Website: https://aider.chat/
  - Description: Pair programming / coding assistant.
- Cline
  - Website: https://github.com/cline/cline
  - Description: Pair programming / coding assistant.
- Enconvo
  - Website: https://www.enconvo.com/
  - Description: Desktop app.
Please note that while these integrations have been tested, you may need to adjust each application's settings to point to your OllamaStraicoAPIProxy instance instead of a local Ollama installation.
Planned improvements:
- Add Tests
- Add Documentation
Contributions are welcome! Please feel free to submit a Pull Request.
This project is licensed under the MIT License.