Added support for the new function calling capability in Chat Completions API #300
Conversation
Betalgo.OpenAI.Utilities
If a chunked response contains a function call, the call gets split up just as if it were a message (on whitespace, from the looks of it), and the function name and arguments are returned as separate chunks. CreateCompletionAsStream() then receives those chunks but can't reassemble them, because it isn't the message content that's being split. It's not clear that splitting up function calls was intended behavior on their end: it breaks the JSON response into chunks that aren't valid on their own, and putting the result back together looks difficult. Maybe we should put up a warning if we detect that the user is calling a streaming method while the Functions list is populated?
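Something along these lines, perhaps (type and member names below are assumptions, just to sketch the idea):

```csharp
using System;
using System.Collections.Generic;

// Illustrative sketch only; type and member names are assumptions, not the final API.
public class FunctionDefinition { public string Name { get; set; } = ""; }

public class ChatCompletionCreateRequest
{
    public IList<FunctionDefinition>? Functions { get; set; }
}

public static class StreamingGuard
{
    // Warn when streaming is requested while function definitions are set,
    // since the streamed function_call arrives as fragments we can't yet reassemble.
    public static void WarnIfFunctionsWithStreaming(ChatCompletionCreateRequest request)
    {
        if (request.Functions is { Count: > 0 })
        {
            Console.Error.WriteLine(
                "Warning: streaming with Functions set may return fragmented function_call chunks.");
        }
    }
}
```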
I would hate to lose streaming support. With streaming functions, the pieces of the JSON are streamed as if they were normal text, starting with an opening `{` and continuing in small fragments.
It was an unexpected bit of work, but I added support for streaming. During stream processing we detect whether a function call is being returned, and if so, a separate handler starts accumulating the function call data and returns it as one package when it's done. I don't think there's a way around this kind of accumulator, because the whole function call JSON needs to be collected before it can be passed anywhere else (e.g. to a JSON parser). Feedback appreciated!
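Roughly the shape of the accumulator, with illustrative type names (not necessarily the exact ones in this PR):

```csharp
using System.Text;

// Sketch of a function-call accumulator for streamed chat completions.
// Type and property names are illustrative assumptions.
public class FunctionCall
{
    public string? Name { get; set; }
    public string? Arguments { get; set; }
}

public class FunctionCallAccumulator
{
    private string? _name;
    private readonly StringBuilder _arguments = new();

    // Feed each streamed delta; the name typically arrives in the first chunk,
    // while the arguments arrive as raw JSON text fragments.
    public void Append(FunctionCall delta)
    {
        if (!string.IsNullOrEmpty(delta.Name))
            _name = delta.Name;

        if (!string.IsNullOrEmpty(delta.Arguments))
            _arguments.Append(delta.Arguments);
    }

    // Called once the chunk with finish_reason == "function_call" arrives.
    public FunctionCall Build() => new()
    {
        Name = _name,
        Arguments = _arguments.ToString()
    };
}
```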
By the way, for anyone interested, this is what a streamed function call looks like compared to an unstreamed one.
Unstreamed:
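Roughly like this (illustrative shape of the relevant fields, not a verbatim capture; other fields omitted):

```json
{
  "choices": [
    {
      "index": 0,
      "message": {
        "role": "assistant",
        "content": null,
        "function_call": {
          "name": "get_current_weather",
          "arguments": "{ \"location\": \"Boston, MA\" }"
        }
      },
      "finish_reason": "function_call"
    }
  ]
}
```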
Streamed:
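Roughly like this, one chunk per line (illustrative, other fields omitted); note how the arguments string is spread across deltas:

```json
{"choices":[{"delta":{"role":"assistant","function_call":{"name":"get_current_weather","arguments":""}},"finish_reason":null}]}
{"choices":[{"delta":{"function_call":{"arguments":"{ \""}},"finish_reason":null}]}
{"choices":[{"delta":{"function_call":{"arguments":"location"}},"finish_reason":null}]}
{"choices":[{"delta":{"function_call":{"arguments":"\": \""}},"finish_reason":null}]}
{"choices":[{"delta":{"function_call":{"arguments":"Boston"}},"finish_reason":null}]}
{"choices":[{"delta":{"function_call":{"arguments":", MA\" }"}},"finish_reason":null}]}
{"choices":[{"delta":{},"finish_reason":"function_call"}]}
```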
Hi there!
Here's a PR that adds support for the new function calling capability in GPT:
https://openai.com/blog/function-calling-and-other-api-updates
Includes
Does not include changes to documentation (like the readme file); I'm not sure what the best place to document this is.
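For a rough sense of how a caller might define a function to pass along with the request (type and member names here are assumptions for illustration, not necessarily the exact API added by this PR):

```csharp
using System.Collections.Generic;

// Illustrative usage sketch; the record below stands in for whatever
// function-definition type the library exposes.
public record FunctionDefinition(string Name, string Description, object Parameters);

public static class Example
{
    // Builds a function definition whose Parameters object follows the
    // JSON Schema format the API expects for function arguments.
    public static FunctionDefinition WeatherFunction() => new(
        Name: "get_current_weather",
        Description: "Get the current weather for a location",
        Parameters: new Dictionary<string, object>
        {
            ["type"] = "object",
            ["properties"] = new Dictionary<string, object>
            {
                ["location"] = new Dictionary<string, object>
                {
                    ["type"] = "string",
                    ["description"] = "City and state, e.g. Boston, MA"
                }
            },
            ["required"] = new[] { "location" }
        });
}
```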