Implement June OpenAI updates #360
Added GPT3Dot5Turbo0613, GPT3Dot5Turbo16K, GPT40613, and GPT432K0613 models from the June update (https://openai.com/blog/function-calling-and-other-api-updates). Issue #360
Which fields? From the refs: https://openai.com/blog/function-calling-and-other-api-updates

{
  "model": "gpt-3.5-turbo-0613",
  "messages": [
    {
      "role": "user",
      "content": "What is the weather like in Boston?"
    }
  ],
  "functions": [
    {
      "name": "get_current_weather",
      "description": "Get the current weather in a given location",
      "parameters": {
        "type": "object",
        "properties": {
          "location": {
            "type": "string",
            "description": "The city and state, e.g. San Francisco, CA"
          },
          "unit": {
            "type": "string",
            "enum": ["celsius", "fahrenheit"]
          }
        },
        "required": ["location"]
      }
    }
  ]
}
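For reference, a minimal sketch of what this request might look like through this library. It assumes a FunctionDefinition type whose Parameters field accepts any JSON-marshalable value; the exact type and field names have shifted between releases, so treat this as illustrative rather than the library's final API.

package main

import (
    "context"
    "encoding/json"
    "fmt"
    "os"

    openai "github.com/sashabaranov/go-openai"
)

func main() {
    client := openai.NewClient(os.Getenv("OPENAI_API_KEY"))

    // The schema is passed through as raw JSON, mirroring the request body above.
    schema := json.RawMessage(`{
      "type": "object",
      "properties": {
        "location": {"type": "string", "description": "The city and state, e.g. San Francisco, CA"},
        "unit": {"type": "string", "enum": ["celsius", "fahrenheit"]}
      },
      "required": ["location"]
    }`)

    resp, err := client.CreateChatCompletion(context.Background(), openai.ChatCompletionRequest{
        Model: openai.GPT3Dot5Turbo0613,
        Messages: []openai.ChatCompletionMessage{
            {Role: openai.ChatMessageRoleUser, Content: "What is the weather like in Boston?"},
        },
        Functions: []openai.FunctionDefinition{{
            Name:        "get_current_weather",
            Description: "Get the current weather in a given location",
            Parameters:  schema, // assumed to be marshaled verbatim into the request
        }},
    })
    if err != nil {
        fmt.Println("completion error:", err)
        return
    }
    // If the model chose to call the function, Name and Arguments are set here.
    fmt.Printf("%+v\n", resp.Choices[0].Message.FunctionCall)
}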
The official documentation related to this: https://platform.openai.com/docs/guides/gpt/function-calling

Note from those docs: "the model may generate invalid JSON or hallucinate parameters". Perhaps we should ensure that we handle this situation properly.
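One way to guard against that, sketched with just the standard library (the weatherArgs type and parseWeatherArgs helper are hypothetical names): decode the returned arguments strictly and check required fields before acting on them.

package weather

import (
    "encoding/json"
    "errors"
    "fmt"
    "strings"
)

// weatherArgs mirrors the get_current_weather parameters defined in the schema.
type weatherArgs struct {
    Location string `json:"location"`
    Unit     string `json:"unit,omitempty"`
}

// parseWeatherArgs decodes the arguments string returned by the model,
// rejecting unknown (hallucinated) fields and checking that the required
// field is actually present.
func parseWeatherArgs(raw string) (*weatherArgs, error) {
    dec := json.NewDecoder(strings.NewReader(raw))
    dec.DisallowUnknownFields()
    var args weatherArgs
    if err := dec.Decode(&args); err != nil {
        return nil, fmt.Errorf("model returned unusable arguments: %w", err)
    }
    if args.Location == "" {
        return nil, errors.New(`required parameter "location" is missing`)
    }
    return &args, nil
}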
It would be nice to use an external jsonschema library. A challenge with implementing a new one is the range of features to support; e.g., JSON Schema supports the 'example' field, and even though it's not shown in the OpenAI docs, I suspect the models would be able to make use of it. That being said, there are many competing libraries, and it's probably hard to pick one that everyone would be happy with. I wonder whether, since this library only needs to properly marshal the request, it warrants a reimplementation of a very limited subset of JSON Schema features -- e.g. no validation, no reflection, just defining the spec and asserting that inputs conform to that spec. Other options might be to treat the function definition as a ...

WRT handling JSON: I wrote an opinionated implementation for how to handle generating JSON schemas via reflection and parsing/validating them from string; see https://github.com/stillmatic/gollum/blob/main/functions_test.go#L133. The implementation itself is quite terse (~50 lines) but does make a fair number of assumptions (e.g. that you have a ...)
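To make the "very limited subset" idea concrete, here is a marshal-only sketch. The Schema type and its fields are my own names, not anything the library currently exposes, and it deliberately skips validation and reflection.

package funcschema

import "encoding/json"

// Schema is a deliberately minimal, marshal-only subset of JSON Schema:
// no validation, no reflection, just the keywords needed to describe
// function parameters.
type Schema struct {
    Type        string            `json:"type,omitempty"`
    Description string            `json:"description,omitempty"`
    Enum        []string          `json:"enum,omitempty"`
    Properties  map[string]Schema `json:"properties,omitempty"`
    Required    []string          `json:"required,omitempty"`
    Items       *Schema           `json:"items,omitempty"`
}

// weatherParams rebuilds the get_current_weather parameters from the
// request shown earlier in this thread.
func weatherParams() json.RawMessage {
    s := Schema{
        Type: "object",
        Properties: map[string]Schema{
            "location": {Type: "string", Description: "The city and state, e.g. San Francisco, CA"},
            "unit":     {Type: "string", Enum: []string{"celsius", "fahrenheit"}},
        },
        Required: []string{"location"},
    }
    b, _ := json.Marshal(s) // a plain struct of strings, slices, and maps cannot fail to marshal
    return b
}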
I agree with this. It can be quite simple and leave the details up to the end user. At the very minimum, we could have the JSON schema input just be a []byte that the user has to prepare elsewhere, using whichever library they prefer. The returned function call could be exposed as a simple interface{} or []byte, leaving it up to the user to unmarshal the JSON themselves and decide how to handle any related errors.
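The consumer side of that proposal could stay just as hands-off. A sketch, assuming the response exposes the call's name and raw arguments string as in the current API (handleFunctionCall is a hypothetical name):

package handler

import (
    "encoding/json"
    "errors"
    "fmt"

    openai "github.com/sashabaranov/go-openai"
)

// handleFunctionCall treats the returned call as opaque JSON: decoding,
// validation, and error policy stay with the caller.
func handleFunctionCall(call *openai.FunctionCall) (map[string]any, error) {
    if call == nil {
        return nil, errors.New("the model did not request a function call")
    }
    var args map[string]any
    if err := json.Unmarshal([]byte(call.Arguments), &args); err != nil {
        // The docs warn the model may emit invalid JSON; surface it instead of guessing.
        return nil, fmt.Errorf("invalid JSON in arguments for %q: %w", call.Name, err)
    }
    return args, nil
}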
I saw that there was a new release (v1.11.0) with function calls implemented. However, they are not accessible in streaming responses. I couldn't find official OpenAI docs on this, but this seems to address the issue. I think this is an easy addition: as far as I can tell, *FunctionCall should be added as a field to ChatCompletionStreamChoiceDelta.
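If that field is added, consumers will also need a small accumulation step, since the argument JSON arrives in fragments across deltas. A sketch under that assumption (collectFunctionCall is a hypothetical helper, and the FunctionCall field on the delta is the suggested addition, not something guaranteed by the release discussed above):

package streamfc

import (
    "errors"
    "io"
    "strings"

    openai "github.com/sashabaranov/go-openai"
)

// collectFunctionCall drains a streaming chat completion and reassembles a
// function call from the per-chunk deltas.
func collectFunctionCall(stream *openai.ChatCompletionStream) (string, string, error) {
    defer stream.Close()

    var name string
    var args strings.Builder
    for {
        chunk, err := stream.Recv()
        if errors.Is(err, io.EOF) {
            break
        }
        if err != nil {
            return "", "", err
        }
        if len(chunk.Choices) == 0 {
            continue
        }
        delta := chunk.Choices[0].Delta
        if fc := delta.FunctionCall; fc != nil {
            if fc.Name != "" {
                name = fc.Name // the name typically arrives once
            }
            args.WriteString(fc.Arguments) // argument JSON arrives in fragments
        }
    }
    return name, args.String(), nil
}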
This PR #373 feels like a good example of why implementing JSONSchema is tricky. I have a tested implementation here which shows that you can successfully call OpenAI with the bytes array: https://github.com/stillmatic/gollum/commit/b10c270cc853054e0d8172ed1bd94c548f343b63/
@stillmatic @jmacwhyte @sashabaranov Issue with ...
@robinbraemer Hey, we've just merged an update that makes it ... Please see the tests for examples. We'll need to add some README examples for this!
Seems to work! |
These features have already been released, so I'm closing this issue. |
See updates at: https://openai.com/blog/function-calling-and-other-api-updates
Off the top of my head: *-0613 models (e.g. gpt-4-0613)