
Implement June OpenAI updates #360

Closed
1 of 2 tasks
sashabaranov opened this issue Jun 13, 2023 · 12 comments
Labels
enhancement New feature or request

Comments

@sashabaranov
Owner

sashabaranov commented Jun 13, 2023

See updates at: https://openai.com/blog/function-calling-and-other-api-updates

Off the top of my head:

  • Implement support for *-0613 models (e.g. gpt-4-0613)
  • Implement function calling
simonklee added a commit to simonklee/go-openai that referenced this issue Jun 13, 2023
Added GPT3Dot5Turbo0613, GPT3Dot5Turbo16K, GPT40613, and GPT432K0613
models from June update
(https://openai.com/blog/function-calling-and-other-api-updates)

Issue sashabaranov#360
sashabaranov pushed a commit that referenced this issue Jun 13, 2023
Added GPT3Dot5Turbo0613, GPT3Dot5Turbo16K, GPT40613, and GPT432K0613
models from June update
(https://openai.com/blog/function-calling-and-other-api-updates)

Issue #360
@vvatanabe
Collaborator

Which fields in functions are required and which are optional? We may need to account for this, for example by making optional fields pointer types.

refs: https://openai.com/blog/function-calling-and-other-api-updates

{
   "model":"gpt-3.5-turbo-0613",
   "messages":[
      {
         "role":"user",
         "content":"What is the weather like in Boston?"
      }
   ],
   "functions":[
      {
         "name":"get_current_weather",
         "description":"Get the current weather in a given location",
         "parameters":{
            "type":"object",
            "properties":{
               "location":{
                  "type":"string",
                  "description":"The city and state, e.g. San Francisco, CA"
               },
               "unit":{
                  "type":"string",
                  "enum":[
                     "celsius",
                     "fahrenheit"
                  ]
               }
            },
            "required":[
               "location"
            ]
         }
      }
   ]
}

@vvatanabe
Collaborator

vvatanabe commented Jun 14, 2023

The official documentation related to functions has been updated.
https://platform.openai.com/docs/api-reference/chat/create#chat/create-functions

@danielchristianschroeter

https://platform.openai.com/docs/guides/gpt/function-calling

"note: the model may generate invalid JSON or hallucinate parameters"

Perhaps we should ensure that we handle this situation properly.
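One way to handle that situation on the caller's side: decode the model-produced arguments strictly and validate required fields before acting on them. A minimal sketch, assuming a hypothetical weatherArgs type for the get_current_weather example above:

```go
package main

import (
	"encoding/json"
	"errors"
	"fmt"
	"strings"
)

// weatherArgs is a hypothetical target type for the arguments the model
// returns for get_current_weather.
type weatherArgs struct {
	Location string `json:"location"`
	Unit     string `json:"unit,omitempty"`
}

// parseWeatherArgs defensively decodes the arguments string: the model may
// emit invalid JSON or hallucinate parameters, so reject unknown fields and
// check required ones explicitly.
func parseWeatherArgs(raw string) (weatherArgs, error) {
	var args weatherArgs
	dec := json.NewDecoder(strings.NewReader(raw))
	dec.DisallowUnknownFields() // reject hallucinated parameters
	if err := dec.Decode(&args); err != nil {
		return weatherArgs{}, fmt.Errorf("model returned invalid arguments: %w", err)
	}
	if args.Location == "" {
		return weatherArgs{}, errors.New("missing required field: location")
	}
	return args, nil
}

func main() {
	args, err := parseWeatherArgs(`{"location": "Boston, MA", "unit": "celsius"}`)
	if err != nil {
		panic(err)
	}
	fmt.Println(args.Location)
}
```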

@stillmatic
Contributor

stillmatic commented Jun 14, 2023

It would be nice to use an external jsonschema library. A challenge with implementing a new one is the range of features to support: e.g., JSON Schema supports the 'example' field, and even though it's not shown in the OpenAI docs, I suspect the models could make use of it. That said, there are many competing libraries, and it's probably hard to pick one that everyone would be happy with.

I wonder whether, since this library only needs to properly marshal the request, it warrants reimplementing a very limited subset of JSON Schema features: no validation, no reflection, etc., just defining the spec and asserting that inputs conform to it. Another option would be to treat the function definition as a json.RawMessage and make the user explicitly responsible for encoding it.

Regarding handling JSON: I wrote an opinionated implementation of generating JSON schemas via reflection and parsing/validating them from strings; see https://github.com/stillmatic/gollum/blob/main/functions_test.go#L133. The implementation itself is quite terse (~50 lines) but makes a fair number of assumptions (e.g. that you have a FunctionInput struct for each function). IMO this logic is very useful but makes too many assumptions for a base library.

The test file also shows an implementation of an extended ChatCompletionRequest, where I embedded the existing struct and added the new fields. Not many changes to the core library are needed to support functions: basically none of the existing logic has to change, just the I/O formats and enums IMO.

@jmacwhyte
Contributor

Not many changes to the core library are needed to support functions: basically none of the existing logic has to change, just the I/O formats and enums IMO

I agree with this. It can be quite simple, and leave the details up to the end user. At the very minimum we could have the JSON schema input just be a []byte that the user has to prepare elsewhere, using whichever library they prefer.

The returned function call could be exposed as a simple interface{} or []byte, and leave it up to the user to unmarshal the JSON themselves and decide how to handle any related errors.

@cem-unuvar
Contributor

I saw that there was a new release (v1.11.0) with function calls implemented. However, they are not accessible in streaming responses. I couldn't find official OpenAI docs on this, but this seems to address the issue. It should be an easy addition: as far as I can tell, *FunctionCall should be added as a field to ChatCompletionStreamChoiceDelta.
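A minimal sketch of that suggestion, mirroring the non-streaming message shape; the type definitions here follow the names mentioned in the thread but are not the merged implementation:

```go
package main

import "fmt"

// FunctionCall mirrors the non-streaming function call payload.
type FunctionCall struct {
	Name      string `json:"name,omitempty"`
	Arguments string `json:"arguments,omitempty"`
}

// ChatCompletionStreamChoiceDelta sketches the suggested change: an optional
// *FunctionCall field that is nil when a chunk carries no call fragment.
type ChatCompletionStreamChoiceDelta struct {
	Content      string        `json:"content,omitempty"`
	Role         string        `json:"role,omitempty"`
	FunctionCall *FunctionCall `json:"function_call,omitempty"`
}

func main() {
	d := ChatCompletionStreamChoiceDelta{
		FunctionCall: &FunctionCall{Name: "get_current_weather"},
	}
	if d.FunctionCall != nil {
		fmt.Println(d.FunctionCall.Name)
	}
}
```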

@stillmatic
Contributor

PR #373 feels like a good example of why implementing JSON Schema is tricky. Using json.RawMessage in the call should work, and it will be more resilient to all the little edge cases.

I have a tested implementation here, which shows that you can successfully call OpenAI with the bytes array: https://github.com/stillmatic/gollum/commit/b10c270cc853054e0d8172ed1bd94c548f343b63/

@vvatanabe
Collaborator

@stillmatic @jmacwhyte @sashabaranov
Congrats! v1.11.3 has been released with #377. I've learned a lot from our discussions. Thank you!

@robinbraemer

robinbraemer commented Jun 22, 2023

Issue with ChatCompletionRequest.FunctionCall

As by OpenAI docs:
function_call is a string or object (Optional)

In the second case, we can't pass an object like:

FunctionCall: `{"name": "extracted_offers_data"}`,

Since it would marshal into an escaped JSON string instead of a JSON object.

OpenAI returns:

error, status code: 400, message: '$.function_call' is invalid. Please check the API reference: https://platform.openai.com/docs/api-reference.
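The escaping behavior is easy to demonstrate with a stripped-down field. Typing function_call as `any` (which is what the later fix does for the real struct) lets the same field hold either the string form or the object form, while a pre-encoded JSON string marshals as an escaped string and triggers the 400 above. The request type here models only that one field:

```go
package main

import (
	"encoding/json"
	"fmt"
)

// request models just the function_call field; an any-typed field can hold
// either the string form ("auto"/"none") or the object form ({"name": ...}).
type request struct {
	FunctionCall any `json:"function_call,omitempty"`
}

func marshalCall(v any) string {
	b, _ := json.Marshal(request{FunctionCall: v})
	return string(b)
}

func main() {
	// String form marshals to a JSON string.
	fmt.Println(marshalCall("auto"))
	// Object form: pass a map (or struct), not a pre-encoded string.
	fmt.Println(marshalCall(map[string]string{"name": "extracted_offers_data"}))
	// Anti-pattern: a raw JSON string is escaped into "{\"name\": ...}".
	fmt.Println(marshalCall(`{"name": "extracted_offers_data"}`))
}
```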

@sashabaranov
Owner Author

@robinbraemer Hey, we've just merged an update that makes it an any type. Could you please try v1.11.3?

Please see the tests for examples. We'll need to add some README examples for this!

@robinbraemer

robinbraemer commented Jun 22, 2023

v1.11.3

Seems to work!

@vvatanabe vvatanabe added the enhancement New feature or request label Jun 30, 2023
@vvatanabe
Collaborator

These features have already been released, so I'm closing this issue.
