Typing: when stream is completed, delta in ChatCompletionChunk from Azure OpenAI is None; should be ChoiceDelta #1677
Comments
@JensMadsen could you share more information to help in reproducing this?
@kristapratico I think I have identified what causes the incorrect types. I use the
@JensMadsen thanks. Unfortunately, I'm still missing something to reproduce this. Could you share the region your resource resides in and/or the prompt that causes this? Edit: Do you by chance have a custom content filter applied to the deployment with asynchronous filtering enabled?
Yes, of course. Region: Sweden. I see this with all prompts so far. Again, using the older API.
@JensMadsen got it. In your screenshot, it does look like the asynchronous content filter is enabled. With the async filter turned on, the Azure response is slightly altered to return information like:

data: {"id":"","object":"","created":0,"model":"","prompt_annotations":[{"prompt_index":0,"content_filter_results":{"hate":{"filtered":false,"severity":"safe"},"self_harm":{"filtered":false,"severity":"safe"},"sexual":{"filtered":false,"severity":"safe"},"violence":{"filtered":false,"severity":"safe"}}}],"choices":[],"usage":null}
data: {"id":"chatcmpl-7rCNsVeZy0PGnX3H6jK8STps5nZUY","object":"chat.completion.chunk","created":1692913344,"model":"gpt-35-turbo","choices":[{"index":0,"finish_reason":null,"delta":{"role":"assistant"}}],"usage":null}
data: {"id":"chatcmpl-7rCNsVeZy0PGnX3H6jK8STps5nZUY","object":"chat.completion.chunk","created":1692913344,"model":"gpt-35-turbo","choices":[{"index":0,"finish_reason":null,"delta":{"content":"Color"}}],"usage":null}
data: {"id":"chatcmpl-7rCNsVeZy0PGnX3H6jK8STps5nZUY","object":"chat.completion.chunk","created":1692913344,"model":"gpt-35-turbo","choices":[{"index":0,"finish_reason":null,"delta":{"content":" is"}}],"usage":null}
data: {"id":"chatcmpl-7rCNsVeZy0PGnX3H6jK8STps5nZUY","object":"chat.completion.chunk","created":1692913344,"model":"gpt-35-turbo","choices":[{"index":0,"finish_reason":null,"delta":{"content":" a"}}],"usage":null}
...
data: {"id":"","object":"","created":0,"model":"","choices":[{"index":0,"finish_reason":null,"content_filter_results":{"hate":{"filtered":false,"severity":"safe"},"self_harm":{"filtered":false,"severity":"safe"},"sexual":{"filtered":false,"severity":"safe"},"violence":{"filtered":false,"severity":"safe"}},"content_filter_offsets":{"check_offset":44,"start_offset":44,"end_offset":198}}],"usage":null}
...
data: {"id":"chatcmpl-7rCNsVeZy0PGnX3H6jK8STps5nZUY","object":"chat.completion.chunk","created":1692913344,"model":"gpt-35-turbo","choices":[{"index":0,"finish_reason":"stop","delta":{}}],"usage":null}
data: {"id":"","object":"","created":0,"model":"","choices":[{"index":0,"finish_reason":null,"content_filter_results":{"hate":{"filtered":false,"severity":"safe"},"self_harm":{"filtered":false,"severity":"safe"},"sexual":{"filtered":false,"severity":"safe"},"violence":{"filtered":false,"severity":"safe"}},"content_filter_offsets":{"check_offset":506,"start_offset":44,"end_offset":571}}],"usage":null}
data: [DONE]

I'm following up with the team to try to understand the reason for this difference. You won't see this with the older version (2023-05-15) since content filter annotations weren't added to the API until 2023-06-01-preview and later. It looks like the async filter is still in preview and could be subject to change, so at the moment I think it might be best to write code that is resilient to this API difference. You're absolutely right that the typing is wrong for Azure in this case, but I believe that this discrepancy lies more on the service than the SDK.
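To illustrate the kind of resilient handling meant above, here is a minimal sketch (not code from this thread; the endpoint, API key, API version, and deployment name are placeholders) that skips the annotation-only chunks and tolerates a missing delta:

```python
from openai import AzureOpenAI

# Placeholder configuration -- substitute your own resource values.
client = AzureOpenAI(
    azure_endpoint="https://my-resource.openai.azure.com",
    api_key="<azure-openai-key>",
    api_version="2023-07-01-preview",
)

stream = client.chat.completions.create(
    model="my-gpt-35-turbo-deployment",  # Azure deployment name
    messages=[{"role": "user", "content": "What is a color?"}],
    stream=True,
)

pieces: list[str] = []
for chunk in stream:
    # With the async content filter, some chunks carry only prompt or content
    # filter annotations: `choices` may be empty, and the final chunks may
    # arrive with `delta` set to None even though the SDK types it as ChoiceDelta.
    if not chunk.choices:
        continue
    delta = chunk.choices[0].delta
    if delta is None or delta.content is None:
        continue
    pieces.append(delta.content)

print("".join(pieces))
```

The same guards are harmless when pointed at a deployment without the async filter; they simply never trigger.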
@kristapratico Any progress on this? This makes the SDK unusable with Azure + async filter, since it causes runtime errors.
@kapis I'm trying to get an update from the team. I was under the impression that the service rolled out a fix for this, but I can confirm what you're seeing.
Confirm this is an issue with the Python library and not an underlying OpenAI API
Describe the bug
When streaming from the Azure OpenAI API, the delta of the choice is None once the stream is completed. In the Python openai client v1.42.0, delta is typed as ChoiceDelta, i.e. not None.
To Reproduce
Run this code in line with
The types are:
- response_chunk: ChatCompletionChunk
- response_chunk.choices: list[Choice]
- response_chunk.choices[0].delta: ChoiceDelta
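A minimal streaming loop along these lines (a sketch, since the original snippet was not attached; the endpoint, key, API version, and deployment name are placeholders, and the deployment must have the asynchronous content filter enabled) exercises those types:

```python
from openai import AzureOpenAI

# Placeholder configuration -- not values from this report.
client = AzureOpenAI(
    azure_endpoint="https://my-resource.openai.azure.com",
    api_key="<azure-openai-key>",
    api_version="2023-07-01-preview",
)

stream = client.chat.completions.create(
    model="my-gpt-35-turbo-deployment",  # Azure deployment name
    messages=[{"role": "user", "content": "What is a color?"}],
    stream=True,
)

for response_chunk in stream:  # response_chunk: ChatCompletionChunk
    for choice in response_chunk.choices:  # choice: Choice
        # choice.delta is annotated as ChoiceDelta, but on the trailing
        # content-filter chunks from Azure it comes back as None.
        print(type(choice.delta), choice.delta)
```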
The response from the Azure OpenAI API returns delta=None when the stream ends. Response example:
Code snippets
No response
OS
Linux (Ubuntu 20.04)
Python version
3.12.1
Library version
openai v1.42.0