Function Parameters Json serialization results in uninspectable chat function input. #326
Same problem here... the request is serialized with the name of the Function class instead of the expected JSON object.
The problem is that the metadata about function parameters is NOT being sent to the model. The request carries only the parameter class itself; that class is used when the LLM returns the arguments in ChatFunctionCall, but its fields and their annotations are never serialized into the outgoing JSON. I think what is missing is sending the parameter metadata to the LLM as a JSON schema, along the lines of the sketch below.
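Presumably something like this - a sketch only, assuming Jackson's jsonSchema module; `FunctionParametersSerializer` is a made-up name, not the library's actual class:

```java
import com.fasterxml.jackson.core.JsonGenerator;
import com.fasterxml.jackson.databind.JsonSerializer;
import com.fasterxml.jackson.databind.ObjectMapper;
import com.fasterxml.jackson.databind.SerializerProvider;
import com.fasterxml.jackson.module.jsonSchema.JsonSchema;
import com.fasterxml.jackson.module.jsonSchema.JsonSchemaGenerator;

import java.io.IOException;

// Turns the function's parameter Class into a JSON schema object, instead of
// letting Jackson's default Class serializer write the class name as a string.
public class FunctionParametersSerializer extends JsonSerializer<Class<?>> {

    private final JsonSchemaGenerator generator = new JsonSchemaGenerator(new ObjectMapper());

    @Override
    public void serialize(Class<?> value, JsonGenerator gen, SerializerProvider serializers) throws IOException {
        if (value == null) {
            gen.writeNull();
            return;
        }
        // generateSchema inspects the class's fields and description
        // annotations and builds an object schema from them.
        JsonSchema schema = generator.generateSchema(value);
        gen.writeObject(schema);
    }
}
```

Registered on the parameters field (via @JsonSerialize or a mixin), this would make the request serialize an object schema instead of a class name.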
I am running into issues with GPT function selection. Taking the API spec from https://platform.openai.com/docs/guides/gpt/function-calling and using Jackson to serialize the full prompt with functions etc., the JSON for the function's parameters lists the class name and not the parameters of the class, even though they are annotated as per the example in the README.
Assuming I am doing something wrong, I created a quick test around the OpenAiApiFunctionsExample and printed out the serialized chat message:
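A minimal sketch of that kind of test - assuming a README-style `get_weather` function backed by a `Weather` class; the field names and descriptions here are illustrative:

```java
import com.fasterxml.jackson.annotation.JsonPropertyDescription;
import com.fasterxml.jackson.databind.ObjectMapper;
import com.theokanning.openai.completion.chat.ChatCompletionRequest;
import com.theokanning.openai.completion.chat.ChatFunction;
import com.theokanning.openai.completion.chat.ChatMessage;
import com.theokanning.openai.completion.chat.ChatMessageRole;

import java.util.Collections;

public class FunctionSerializationTest {

    // Parameter class as in the README example (static so Jackson can build it).
    public static class Weather {
        @JsonPropertyDescription("City and state, for example: San Francisco, CA")
        public String location;
    }

    public static void main(String[] args) throws Exception {
        ChatFunction function = ChatFunction.builder()
                .name("get_weather")
                .description("Get the current weather in a given location")
                .executor(Weather.class, w -> "sunny")
                .build();

        ChatCompletionRequest request = ChatCompletionRequest.builder()
                .model("gpt-3.5-turbo-0613")
                .messages(Collections.singletonList(
                        new ChatMessage(ChatMessageRole.USER.value(), "What is the weather in New York?")))
                .functions(Collections.singletonList(function))
                .build();

        // A plain ObjectMapper knows nothing about the library's serialization
        // setup, so the Class-typed parameters field falls back to Jackson's
        // default Class handling and prints the class name, e.g.
        //   "parameters":"FunctionSerializationTest$Weather"
        // instead of the expected schema object.
        System.out.println(new ObjectMapper().writeValueAsString(request));
    }
}
```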
The same problem occurs as in my own code: setting a breakpoint in OpenAiService.java and inspecting the request object, the parameter descriptions are not in the object being sent to the OpenAI service, only the class name.
The response comes back from OpenAI as:
Response: ChatMessage(role=assistant, content=null, name=null, functionCall=ChatFunctionCall(name=get_weather, arguments={"location":"New York","unit":"CELSIUS"}))
Trying to execute get_weather...
So it has to be seeing the object parameters, but they are not visible for inspection within the code? That makes verifying the input provided to the API tricky.
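If I read the service code right, that is exactly what happens: the request object only ever holds the parameters Class, and the schema is generated at write time by the mapper the service hands to its HTTP client. A sketch of one way to see the payload that is actually sent - assuming `OpenAiService.defaultObjectMapper()` is the mapper wired into the client:

```java
import com.fasterxml.jackson.databind.ObjectMapper;
import com.theokanning.openai.completion.chat.ChatCompletionRequest;
import com.theokanning.openai.service.OpenAiService;

public class RequestInspector {
    // Serializing with the service's own mapper should reproduce the JSON that
    // goes over the wire, with parameters expanded into a schema object.
    static String wirePayload(ChatCompletionRequest request) throws Exception {
        ObjectMapper mapper = OpenAiService.defaultObjectMapper();
        return mapper.writerWithDefaultPrettyPrinter().writeValueAsString(request);
    }
}
```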
Addendum - the docs use the example
public class Weather {
but the class in the code is static, and it appears it is required to be static to operate correctly.
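That requirement makes sense if the parameter class is declared as a nested class: Jackson cannot instantiate a non-static inner class (it has no constructor independent of the enclosing instance) when it deserializes the arguments from ChatFunctionCall. A sketch of the nested form, with illustrative descriptions:

```java
import com.fasterxml.jackson.annotation.JsonPropertyDescription;

public class WeatherFunctions {
    // Must be static: Jackson needs to construct it without an enclosing
    // WeatherFunctions instance when mapping the returned arguments.
    public static class Weather {
        @JsonPropertyDescription("City and state, for example: San Francisco, CA")
        public String location;

        @JsonPropertyDescription("The temperature unit to use")
        public WeatherUnit unit;
    }

    public enum WeatherUnit {
        CELSIUS, FAHRENHEIT
    }
}
```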