Closed
Labels
M20 (Milestone 20: Provider flexibility), llm (LLM provider related), priority/high (High priority), size/M
Description
Epic: #365
A generic LLM provider adapter that works with any OpenAI-compatible API (Groq, Together, Fireworks, DeepSeek, Mistral, xAI, etc.) via a configurable base_url.
Design
```toml
[llm]
provider = "compatible"

[llm.compatible]
base_url = "https://api.groq.com/openai/v1"
api_key_env = "GROQ_API_KEY"
model = "llama-3.3-70b-versatile"
embedding_model = "nomic-embed-text-v1.5"
```
Tasks
- `CompatibleProvider` implementing the `LlmProvider` trait - reuse OpenAI HTTP client logic (chat, stream, embed endpoints)
- Feature-gate under `openai` (shares the reqwest client)
- Config parsing with `base_url`, `api_key_env`, `model`, `embedding_model`
- Unit tests with a mock HTTP server (wiremock)
- Refactor the existing `OpenAiProvider` to delegate to `Compatible` internally