[OPEN-7543] Update OpenAI Wrapper to support Responses API #539
Conversation
Force-pushed from c588901 to 82f5365
@viniciusdsmello, this PR is almost good to go! I left a couple of comments on some parts of the code.
Beyond those, it would be good if you could squash the commits so that this PR has a single commit (following the convention we discussed).
Thanks!
@@ -0,0 +1,147 @@
{
No need for this example. This is redundant with openai_tracing.ipynb. I would just add a new cell showing that the same tracer works with the responses API.
model_parameters=get_responses_model_parameters(kwargs),
raw_output=response.model_dump() if hasattr(response, "model_dump") else str(response),
id=inference_id,
metadata={"api_type": "responses"},
You can remove this metadata, unless there's an intention to leverage it somehow in the platform. From the name of the step, it's already clear that it's a Responses API call (instead of a Chat Completions call).
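The `model_dump` fallback in the diff above can be sketched in isolation. The `DummyResponse` class here is purely illustrative (it stands in for an OpenAI SDK response object, which exposes pydantic's `model_dump()`):

```python
from typing import Any, Union


def serialize_raw_output(response: Any) -> Union[dict, str]:
    # Prefer pydantic's model_dump() when available (OpenAI SDK objects),
    # otherwise fall back to the string representation.
    if hasattr(response, "model_dump"):
        return response.model_dump()
    return str(response)


class DummyResponse:
    """Stand-in for an OpenAI SDK response object (illustrative only)."""

    def model_dump(self) -> dict:
        return {"output_text": "6"}


print(serialize_raw_output(DummyResponse()))  # {'output_text': '6'}
print(serialize_raw_output("plain string"))   # plain string
```

This keeps the traced `raw_output` JSON-serializable for SDK objects while still recording something useful for unexpected return types.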
return result


def extract_responses_inputs(kwargs: Dict[str, Any]) -> Dict[str, Any]:
There's an issue with this function; consequently, the inputs for the OpenAI Responses steps are not rendering properly.
For this step, the input should be something like:
{
    "prompt": [{"role": "user", "content": "What is 3+3?"}]
}
I believe you're currently sending: {"prompt": "What is 3+3?"}.
Please double-check the Responses API reference to ensure this function works for all input formats (e.g., single user message, multi-turn conversation, etc.).
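A minimal sketch of how `extract_responses_inputs` could normalize both input shapes the Responses API accepts (a bare string or a list of message dicts). This is an illustrative fix under those assumptions, not the PR's actual implementation:

```python
from typing import Any, Dict, List


def extract_responses_inputs(kwargs: Dict[str, Any]) -> Dict[str, Any]:
    """Normalize the Responses API `input` kwarg into a prompt message list.

    The Responses API accepts either a plain string or a list of message
    dicts, so both shapes are coerced to [{"role": ..., "content": ...}].
    """
    raw_input = kwargs.get("input", "")
    if isinstance(raw_input, str):
        # Single user message passed as a bare string.
        prompt: List[Dict[str, Any]] = [{"role": "user", "content": raw_input}]
    elif isinstance(raw_input, list):
        # Multi-turn conversation: keep the message dicts as-is.
        prompt = raw_input
    else:
        # Fallback for unexpected shapes (e.g., typed input items).
        prompt = [{"role": "user", "content": str(raw_input)}]
    return {"prompt": prompt}


print(extract_responses_inputs({"input": "What is 3+3?"}))
# {'prompt': [{'role': 'user', 'content': 'What is 3+3?'}]}
```

With this shape, the single-message case renders the same way as a multi-turn conversation, matching the expected input above.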
Pull Request
Summary
This PR adds comprehensive tracing support for OpenAI's new Responses API (client.responses.create) while maintaining full backward compatibility with the existing Chat Completions API (client.chat.completions.create).
Changes
- Updated trace_openai and trace_async_openai to dynamically detect and patch the responses.create endpoint.
- Updated add_to_trace to differentiate between Chat Completions and Responses API calls for improved trace naming and metadata.
- Added a new example (examples/tracing/openai/responses_api_example.py) demonstrating usage for both APIs, including streaming and function calling.
Context
OpenAI's Responses API unifies multiple capabilities (chat, text, tool use, JSON mode) into a single interface, providing improved metadata structure and traceability. This update extends Openlayer's tracing logic to support this new, more aligned endpoint, ensuring that users can leverage the latest OpenAI features with full observability without breaking existing integrations.
Testing