The Feature
When streaming Structured Output from OpenAI's SDK, the response arrives in a parsable form. Each streamed event carries both the delta chunk coming from the LLM and the response accumulated so far, parsed into the provided Pydantic model.
Below is the format of the response streamed by OpenAI:
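For illustration, here is a minimal sketch of this streaming pattern with the OpenAI Python SDK's beta streaming helper. The event and field names reflect my understanding of the SDK and may vary by version, and CalendarEvent is a placeholder model used only for this example:

```python
from openai import OpenAI
from pydantic import BaseModel


# Placeholder response model, for illustration only
class CalendarEvent(BaseModel):
    name: str
    date: str
    participants: list[str]


client = OpenAI()

with client.beta.chat.completions.stream(
    model="gpt-4o-2024-08-06",
    messages=[{"role": "user", "content": "Alice and Bob meet on Friday."}],
    response_format=CalendarEvent,
) as stream:
    for event in stream:
        if event.type == "chunk" and event.chunk.choices:
            # raw text delta produced in this chunk
            print(event.chunk.choices[0].delta.content)
        elif event.type == "content.delta":
            # Pydantic object parsed from the content accumulated so far
            print(event.parsed)
```

The key point is that every event exposes both the incremental text and a partially parsed structured object, so the caller can render or act on the structured output before the stream completes.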
Motivation, pitch
As you can see, we get both the delta of content (ChunkEvent.choices[0].delta.content) as it is generated and the output parsed so far (inside ChunkEvent.parsed). This lets me stream the response as it arrives in Structured Output. I am working on a use case where I need to stream and parse the response as Structured Output while it is being generated, without waiting for the stream to complete. Please add this feature to litellm.

Are you a ML Ops Team?
Yes
Twitter / LinkedIn details
No response