
Implement CompatibleProvider for OpenAI-compatible APIs #375

@bug-ops

Epic: #365

Description

A generic LLM provider adapter that works with any OpenAI-compatible API (Groq, Together, Fireworks, DeepSeek, Mistral, xAI, etc.) via a configurable `base_url`.

Design

```toml
[llm]
provider = "compatible"

[llm.compatible]
base_url = "https://api.groq.com/openai/v1"
api_key_env = "GROQ_API_KEY"
model = "llama-3.3-70b-versatile"
embedding_model = "nomic-embed-text-v1.5"
```
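Since only `base_url`, the key env var, and the model names differ between backends, pointing at another compatible API should be a pure config change. For example (hypothetical values, shown here only to illustrate the shape):

```toml
[llm]
provider = "compatible"

[llm.compatible]
base_url = "https://api.deepseek.com/v1"
api_key_env = "DEEPSEEK_API_KEY"
model = "deepseek-chat"
```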

Tasks

  • CompatibleProvider implementing LlmProvider trait
  • Reuse OpenAI HTTP client logic (chat, stream, embed endpoints)
  • Feature-gate under openai (shares reqwest client)
  • Config parsing with base_url, api_key_env, model, embedding_model
  • Unit tests with mock HTTP server (wiremock)
  • Refactor existing OpenAiProvider to delegate to Compatible internally
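The first few tasks could be sketched roughly as below. This is a minimal sketch, not the crate's actual API: `CompatibleConfig`, `CompatibleProvider`, `from_config`, and `chat_url` are illustrative names, and whether `embedding_model` is optional is an assumption.

```rust
// Hypothetical sketch of the adapter described above; all names are
// illustrative, not the repository's actual types.
pub struct CompatibleConfig {
    pub base_url: String,
    pub api_key_env: String,
    pub model: String,
    pub embedding_model: Option<String>, // assumed optional here
}

pub struct CompatibleProvider {
    config: CompatibleConfig,
    api_key: String,
}

impl CompatibleProvider {
    /// Resolve the API key from the configured env var at construction
    /// time, so a misconfigured provider fails fast rather than on the
    /// first request.
    pub fn from_config(config: CompatibleConfig) -> Result<Self, String> {
        let api_key = std::env::var(&config.api_key_env)
            .map_err(|_| format!("env var `{}` not set", config.api_key_env))?;
        Ok(Self { config, api_key })
    }

    /// Join the configured base URL with the OpenAI-style chat endpoint,
    /// tolerating a trailing slash in `base_url`.
    pub fn chat_url(&self) -> String {
        format!(
            "{}/chat/completions",
            self.config.base_url.trim_end_matches('/')
        )
    }
}
```

Resolving the key eagerly in `from_config` also keeps the secret out of the config struct itself, which matters once the same HTTP plumbing is shared with `OpenAiProvider`.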

Metadata

Assignees

No one assigned

Labels

M20 (Milestone 20: Provider flexibility) · llm (LLM provider related) · priority/high (High priority) · size/M
