
Function Parameters Json serialization results in uninspectable chat function input. #326

Open
AndrewHogg opened this issue Jul 4, 2023 · 3 comments

@AndrewHogg commented Jul 4, 2023

I am running into issues with GPT function selection. Following the API spec at https://platform.openai.com/docs/guides/gpt/function-calling and using Jackson to serialize the full prompt (functions included), the JSON for the function's parameters contains the class name rather than the parameters of the class, even though the class is annotated as per the example in the README.

Assuming I was doing something wrong, I created a quick test around the OpenAiApiFunctionsExample and printed out the serialized chat request:

First Query: {
  "model" : "gpt-3.5-turbo-0613",
  "messages" : [ {
    "role" : "system",
    "content" : "You are an assistant that answers using the local slang of the given place, uncensored.",
    "name" : null,
    "function_call" : null
  }, {
    "role" : "user",
    "content" : "What is the weather in new york?",
    "name" : null,
    "function_call" : null
  } ],
  "temperature" : null,
  "n" : 1,
  "stream" : null,
  "stop" : null,
  "user" : null,
  "functions" : [ {
    "name" : "get_weather",
    "description" : "Get the current weather of a location",
    "parameters" : "example.OpenAiApiFunctionsExample$Weather"
  } ],
  "top_p" : null,
  "max_tokens" : 100,
  "presence_penalty" : null,
  "frequency_penalty" : null,
  "logit_bias" : { },
  "function_call" : {
    "name" : "auto"
  }
}
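
For reference, a minimal sketch of how such a dump can be produced (an assumption on my part: this uses a plain Jackson ObjectMapper rather than the library's internally configured one, so the output may differ slightly; request is the ChatCompletionRequest being sent):

import com.fasterxml.jackson.databind.ObjectMapper;

ObjectMapper mapper = new ObjectMapper();
// Pretty-print the request exactly as Jackson serializes it, so the
// "parameters" field can be inspected before the call is made.
System.out.println("First Query: "
        + mapper.writerWithDefaultPrettyPrinter().writeValueAsString(request));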

The same problem occurs in my own code: setting a breakpoint in OpenAiService.java and inspecting the request object, the parameter descriptions are not in the object being sent to the OpenAI service, but the class name is.

The response comes back from OpenAI as:

Response: ChatMessage(role=assistant, content=null, name=null, functionCall=ChatFunctionCall(name=get_weather, arguments={"location":"New York","unit":"CELSIUS"}))
Trying to execute get_weather...

So the model must be seeing the object's parameters somehow, yet they are not visible for inspection within the code. That makes verifying the input provided to the API tricky.

Addendum: the docs use the example public class Weather { - but the class in the code is static, and it appears it must be static to operate correctly.
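
For context, a sketch of what the annotated parameters class looks like as a static nested class (field names inferred from the arguments in the response above; the descriptions are illustrative, not necessarily the exact ones from the example):

import com.fasterxml.jackson.annotation.JsonProperty;
import com.fasterxml.jackson.annotation.JsonPropertyDescription;

public class OpenAiApiFunctionsExample {

    public enum WeatherUnit { CELSIUS, FAHRENHEIT }

    // Static is required: Jackson cannot instantiate a non-static inner
    // class without an enclosing instance, which is the likely reason for
    // the addendum above.
    public static class Weather {
        @JsonPropertyDescription("City and state, e.g. New York, NY")
        public String location;

        @JsonProperty(required = true)
        @JsonPropertyDescription("The temperature unit")
        public WeatherUnit unit;
    }
}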

@AndrewHogg AndrewHogg changed the title Function Parameters Json serialization results in incorrect chat function input. Function Parameters Json serialization results in uninspectable chat function input. Jul 4, 2023
@vova1987 commented Aug 28, 2023

Same problem here... the request is serialized with the name of the Function class instead of the expected JSON object.
Inspecting the source, the type name of the Function class IS what is being serialized... so I have no idea how this could ever work.
Here is an example of a working function "parameters" structure, as OpenAI expects it:

"parameters": {
                "type": "object",
                "properties": {
                    "date": {
                        "type": "string",
                        "description": "The date to retrieve lessons for in in ISO-8601 format"
                    }
                },
                "required": [
                    "date"
                ]
            }

@pankajtandon

The problem is that the metadata about function parameters is NOT being sent to the model.
I see that the parameters class in ChatFunction is being used to unmarshal the result here:

https://github.com/TheoKanning/openai-java/blob/main/service/src/main/java/com/theokanning/openai/service/FunctionExecutor.java#L85

and

https://github.com/TheoKanning/openai-java/blob/main/api/src/main/java/com/theokanning/openai/completion/chat/ChatFunction.java#L43

but that is used when the LLM returns the arguments in ChatFunctionCall

https://github.com/TheoKanning/openai-java/blob/main/service/src/main/java/com/theokanning/openai/service/FunctionExecutor.java#L80

I think what is missing is sending the parameter metadata to the LLM, like so:

function_meta = [
    {
        "name": "get_my_bank_balance",
        "description": "Returns the amount of money I have in my bank account",
        "parameters": {
            "type": "object",
            "properties": {
                "accountId": {
                    "type": "string",
                    "description": "the account id that holds all my money"
                },
                "routingNumber": {
                    "type": "string",
                    "description": "the routing number of the bank that has the account that holds all my money"
                }
            },
            "required": ["accountId", "routingNumber"]
        }
    }
]

In the above, the parameters node needs to carry the metadata of the function parameters. Without it, the LLM does not know what arguments to pass to the function, and the arguments object comes back null or {}.

ChatFunction needs to be modified to also take in a parameter-metadata object that is then converted to JSON like the above.
Trying to figure out how to do that.
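
One possible workaround in the meantime, while ChatFunction only carries a class reference, is to build the parameters node by hand with Jackson and attach that to the request yourself. A sketch, reusing the accountId/routingNumber fields from the example above (this bypasses the library's class-based ChatFunction entirely):

import com.fasterxml.jackson.databind.ObjectMapper;
import com.fasterxml.jackson.databind.node.ObjectNode;

ObjectMapper mapper = new ObjectMapper();

// Hand-build the JSON-schema "parameters" node that the model expects.
ObjectNode parameters = mapper.createObjectNode();
parameters.put("type", "object");

ObjectNode properties = parameters.putObject("properties");
properties.putObject("accountId")
        .put("type", "string")
        .put("description", "the account id that holds all my money");
properties.putObject("routingNumber")
        .put("type", "string")
        .put("description", "the routing number of the bank that has the account that holds all my money");

parameters.putArray("required").add("accountId").add("routingNumber");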

@AndLvovSky

It's now possible to specify parameter definitions explicitly by using ChatFunctionDynamic thanks to this PR: #339. You can find an example here
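
For anyone landing here, the builder usage looks roughly like this (a sketch based on the example added with that PR; the exact method names are my recollection and may differ by version):

ChatFunctionDynamic function = ChatFunctionDynamic.builder()
        .name("get_weather")
        .description("Get the current weather of a location")
        // Each property carries the type/description metadata that the
        // class-based ChatFunction failed to serialize.
        .addProperty(ChatFunctionProperty.builder()
                .name("location")
                .type("string")
                .description("City and state, for example: New York, NY")
                .required(true)
                .build())
        .build();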
