Related to
LLM providers have widely implemented the ChatCompletions API, each with its own wrinkles. LiteLLM exposes an OpenAI-compatible API over many of these providers.
Today, agent-sdk talks to LiteLLM via ChatCompletions. On the agent-sdk side we want to keep making a simple, OpenAI-compatible call to the LLM, which LiteLLM then forwards to the provider.
More recently, however, OpenAI released a different API: the Responses API. They send reasoning content only via the Responses API, not via the older ChatCompletions API. So at least for OpenAI models, we need to use Responses.
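To make the difference concrete, here is a simplified comparison of the two response shapes. The field names follow OpenAI's published docs, but the exact payloads here are illustrative assumptions, not captured API output; the point is only that a dedicated reasoning item appears in Responses output and has no counterpart in ChatCompletions:

```python
# Illustrative, simplified response shapes (assumed field names, not real API output).

# ChatCompletions: the assistant reply lives in choices[i].message;
# there is no dedicated field carrying the model's reasoning trace.
chat_completions_response = {
    "choices": [
        {"message": {"role": "assistant", "content": "4"}},
    ],
}

# Responses API: output is a list of typed items, and reasoning
# is surfaced as its own item alongside the message.
responses_api_response = {
    "output": [
        {"type": "reasoning",
         "summary": [{"type": "summary_text", "text": "2 + 2 = 4"}]},
        {"type": "message",
         "content": [{"type": "output_text", "text": "4"}]},
    ],
}

reasoning_items = [o for o in responses_api_response["output"]
                  if o["type"] == "reasoning"]
print(len(reasoning_items))  # prints 1
```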
I can think of two options:
- we can implement the Responses API directly in llm.py, via an adapter that converts the messages on the fly
- or we can add a new llm_responses.py alongside it
The first option looks like this:
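A minimal sketch of what that adapter could look like, assuming the simplified Responses payload shapes above. The helper names and the `reasoning_content` field on the returned message are hypothetical, not part of any existing llm.py interface:

```python
from typing import Any

def chat_messages_to_responses_input(messages: list[dict[str, Any]]) -> list[dict[str, Any]]:
    """Convert ChatCompletions-style messages into Responses-style input items.

    Hypothetical adapter: assumes plain string `content` and maps every
    message to an `input_text` content part.
    """
    return [
        {"role": msg["role"],
         "content": [{"type": "input_text", "text": msg["content"]}]}
        for msg in messages
    ]

def responses_output_to_chat_message(output: list[dict[str, Any]]) -> dict[str, Any]:
    """Fold Responses output items back into one chat-style assistant message,
    preserving the reasoning summary that ChatCompletions would drop."""
    text_parts: list[str] = []
    reasoning_parts: list[str] = []
    for item in output:
        if item["type"] == "message":
            text_parts.extend(
                part["text"] for part in item["content"]
                if part["type"] == "output_text"
            )
        elif item["type"] == "reasoning":
            reasoning_parts.extend(
                part.get("text", "") for part in item.get("summary", [])
            )
    return {
        "role": "assistant",
        "content": "".join(text_parts),
        # Hypothetical field for carrying reasoning back to the caller.
        "reasoning_content": "\n".join(reasoning_parts) or None,
    }

# Round-trip demo on the assumed shapes.
items = chat_messages_to_responses_input([{"role": "user", "content": "what is 2 + 2?"}])
message = responses_output_to_chat_message([
    {"type": "reasoning", "summary": [{"type": "summary_text", "text": "2 + 2 = 4"}]},
    {"type": "message", "content": [{"type": "output_text", "text": "4"}]},
])
print(message["content"])  # prints 4
```

With this option, llm.py would keep its current OpenAI-compatible call signature and only swap the wire format when the target model requires the Responses API.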