
Ollama Straico API Proxy


Project Description

StraicoAPIProxy implements the same API endpoints but redirects the requests to the Straico API server. It currently supports the Ollama, LMStudio/OpenAI, and Anthropic Claude API endpoints.

This allows you to use any application that supports Ollama, LMStudio/OpenAI and Anthropic Claude while leveraging Straico's available cloud LLM models instead of running a local LLM.

Disclaimer: This is not an official Ollama or Straico product.

Setup

Please follow the Setup Guide.

Usage

Once the container is running, you can use any Ollama-, LMStudio/OpenAI-, or Anthropic Claude-compatible application by pointing it at the proxy's base URL. By default the proxy listens on port 11434 unless modified in the docker-compose.yml file.

Base URL for each provider

  1. Ollama
  2. LMStudio/OpenAI
  3. Anthropic Claude
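Assuming the proxy runs locally on the default port, the base URLs would look like the sketch below. The exact prefix each client expects is an assumption (e.g. OpenAI-style clients usually take a base ending in /v1, while Anthropic clients append /v1/messages themselves); adjust to match your deployment.

```python
# Assumed base URLs for a proxy on the default port (11434).
PROXY_HOST = "http://localhost:11434"

BASE_URLS = {
    "ollama": PROXY_HOST,                    # e.g. /api/chat, /api/tags
    "lmstudio_openai": f"{PROXY_HOST}/v1",   # e.g. /v1/chat/completions
    "anthropic": PROXY_HOST,                 # client appends /v1/messages
}

for provider, base in sorted(BASE_URLS.items()):
    print(f"{provider}: {base}")
```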

API Endpoints

The proxy exposes the following endpoints:

Ollama

  1. /api/generate
  2. /api/chat
  3. /api/tags
  4. /api/embeddings
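A minimal sketch of a non-streaming /api/chat request against the proxy, following the standard Ollama request shape. The model id below is hypothetical; query /api/tags for the models your Straico account actually exposes.

```python
import json
import urllib.request

payload = {
    "model": "openai/gpt-4o-mini",  # hypothetical model id
    "messages": [{"role": "user", "content": "Say hello in one word."}],
    "stream": False,
}
req = urllib.request.Request(
    "http://localhost:11434/api/chat",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
    method="POST",
)
# Uncomment once the proxy container is running:
# with urllib.request.urlopen(req) as resp:
#     print(json.load(resp)["message"]["content"])
print(req.full_url)
```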

LM Studio

  1. /v1/chat/completions
    • alias: /chat/completions
  2. /v1/completions
  3. /v1/models
  4. /v1/embeddings
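A sketch of an OpenAI-style chat completion through the proxy. Because of the alias noted above, both paths below should behave the same; the model id is a placeholder.

```python
import json
import urllib.request

body = {
    "model": "openai/gpt-4o-mini",  # hypothetical model id
    "messages": [{"role": "user", "content": "Hello!"}],
}
for path in ("/v1/chat/completions", "/chat/completions"):
    url = f"http://localhost:11434{path}"
    print(url)
# To actually send (requires a running proxy), e.g. for the last url:
# req = urllib.request.Request(url, data=json.dumps(body).encode(),
#                              headers={"Content-Type": "application/json"})
# with urllib.request.urlopen(req) as resp:
#     print(json.load(resp)["choices"][0]["message"]["content"])
```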

OpenAI

  1. /v1/chat/completions
  2. /v1/audio/speech
  3. /v1/images/generations
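Hedged sketches of the two extra OpenAI-style endpoints. The request bodies follow the standard OpenAI shapes; whether the proxy expects these exact model names and fields is an assumption, so check the Setup Guide for what Straico supports.

```python
import json

speech_request = {
    "url": "http://localhost:11434/v1/audio/speech",
    # Standard OpenAI text-to-speech body; model/voice names assumed.
    "body": {"model": "tts-1", "input": "Hello from the proxy", "voice": "alloy"},
}
image_request = {
    "url": "http://localhost:11434/v1/images/generations",
    # Standard OpenAI image-generation body.
    "body": {"prompt": "a lighthouse at dusk", "n": 1, "size": "1024x1024"},
}
for r in (speech_request, image_request):
    print(r["url"], json.dumps(r["body"]))
```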

Anthropic Claude

  1. /v1/messages
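A sketch of an Anthropic-style /v1/messages request. The headers follow the standard Anthropic API convention; whether the proxy checks the x-api-key value is an assumption (authentication against Straico is configured in the proxy itself), and the model id is a placeholder.

```python
import json
import urllib.request

headers = {
    "Content-Type": "application/json",
    "x-api-key": "placeholder",           # likely ignored by the proxy
    "anthropic-version": "2023-06-01",
}
body = {
    "model": "anthropic/claude-3.5-sonnet",  # hypothetical model id
    "max_tokens": 256,
    "messages": [{"role": "user", "content": "Hello!"}],
}
req = urllib.request.Request(
    "http://localhost:11434/v1/messages",
    data=json.dumps(body).encode("utf-8"),
    headers=headers,
    method="POST",
)
# with urllib.request.urlopen(req) as resp:  # needs a running proxy
#     print(json.load(resp)["content"][0]["text"])
print(req.full_url)
```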

Known Working Integrations

OllamaStraicoAPIProxy has been tested and confirmed to work with the following applications and integrations:

  1. Home Assistant

    • Integration: Ollama for Home Assistant
    • Description: Use OllamaStraicoAPIProxy with Home Assistant for AI-powered home automation tasks.
  2. Logseq

    • Plugin: ollama-logseq
    • Description: Integrate OllamaStraicoAPIProxy with Logseq for enhanced note-taking and knowledge management.
  3. Obsidian

    • Plugin: obsidian-ollama
    • Description: Use OllamaStraicoAPIProxy within Obsidian for AI-assisted note-taking and writing.
  4. Snippety

    • Website: https://snippety.app/
    • Description: Leverage OllamaStraicoAPIProxy with Snippety for AI assisted snippet management and generation.
  5. Rivet

  6. Continue.dev

  7. Open WebUI

  8. Flowise

  9. Aider Chat

  10. Cline

  11. Enconvo

Please note that while these integrations have been tested, you may need to adjust their settings to point to your OllamaStraicoAPIProxy instance instead of a local Ollama installation.

To-Do List

  1. Add Tests
  2. Add Documentation

Contributing

Contributions are welcome! Please feel free to submit a Pull Request.

License

This project is licensed under the MIT License.

Acknowledgements

Referral link to DigitalOcean.
