Support Responses API #138

@enyst

Description

Related to

LLM providers have implemented ChatCompletions widely, with various wrinkles.
liteLLM offers an openai-compatible API to many providers.

Today we use litellm with ChatCompletions in agent-sdk. On the agent-sdk side, we want to keep making a simple, openai-compatible call to the LLM, which litellm then forwards to the provider.
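For context, the current path is essentially a plain ChatCompletions call forwarded through litellm; the model name and messages below are illustrative:

```python
import litellm

# Openai-compatible ChatCompletions call; litellm forwards it to the provider.
response = litellm.completion(
    model="openai/gpt-4o",
    messages=[
        {"role": "system", "content": "You are a coding agent."},
        {"role": "user", "content": "Summarize the latest diff."},
    ],
)
print(response.choices[0].message.content)
```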

But relatively recently, OpenAI released a different API: the Responses API. OpenAI sends reasoning content only via the Responses API, not the older ChatCompletions API. So, at least for OpenAI models, we need to use Responses.
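For illustration, a direct Responses API call with the OpenAI SDK looks roughly like this (the `reasoning` parameter details are a sketch from memory and may differ by model):

```python
from openai import OpenAI

client = OpenAI()
resp = client.responses.create(
    model="o3-mini",
    input=[{"role": "user", "content": "Why is the sky blue?"}],
    reasoning={"effort": "medium", "summary": "auto"},
)
# The output is a list of typed items; reasoning arrives as its own item
# alongside the assistant message, which ChatCompletions never exposes.
for item in resp.output:
    print(item.type)
```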

I can think of two options:

  • we can implement the Responses API directly in llm.py, via an adapter that converts the messages on the fly
  • or we can make a new llm_responses.py

The first option looks like this:
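A minimal sketch, assuming litellm exposes a `responses()` entry point; `_is_responses_model` and `_to_responses_input` are hypothetical helpers, not existing agent-sdk code:

```python
import litellm


def _is_responses_model(model: str) -> bool:
    # Illustrative heuristic: only OpenAI reasoning models go to Responses.
    return model.split("/")[-1].startswith(("o1", "o3"))


def _to_responses_input(messages: list[dict]) -> list[dict]:
    # ChatCompletions messages and Responses input items line up for the
    # simple text case; tool calls etc. would need a real mapping.
    return [{"role": m["role"], "content": m["content"]} for m in messages]


def completion(model: str, messages: list[dict], **kwargs):
    if _is_responses_model(model):
        # Convert on the fly and hit the Responses endpoint instead.
        return litellm.responses(
            model=model, input=_to_responses_input(messages), **kwargs
        )
    return litellm.completion(model=model, messages=messages, **kwargs)
```

The rest of agent-sdk keeps calling a single entry point; only the wire format changes underneath.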
