example @function_tool with optional parameters #43
When you use `@function_tool`, it enforces "strict mode" (https://platform.openai.com/docs/guides/function-calling?api-mode=chat#strict-mode), which means params are required.

If you specifically don't want strict mode, you can manually create a `FunctionTool` instead (sketched below).
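For illustration, here is a minimal sketch of what manually creating a `FunctionTool` might look like. The field names (`params_json_schema`, `on_invoke_tool`, `strict_json_schema`) and the invoke signature are assumptions about the SDK; check `agents.tool` for the real definitions.

```python
import json

from agents import FunctionTool, RunContextWrapper  # assumed exports


def get_latest_elasticsearch_version(major_version: int | None = None) -> str:
    """Returns the latest GA version of Elasticsearch in "X.Y.Z" format."""
    return "8.17.3"


async def invoke(ctx: RunContextWrapper, args_json: str) -> str:
    # The model's arguments arrive as a JSON string; an omitted optional
    # parameter simply won't appear in it.
    args = json.loads(args_json) if args_json else {}
    return get_latest_elasticsearch_version(args.get("major_version"))


es_version_tool = FunctionTool(
    name="get_latest_elasticsearch_version",
    description='Returns the latest GA version of Elasticsearch in "X.Y.Z" format.',
    params_json_schema={
        "type": "object",
        "properties": {
            "major_version": {
                "type": "integer",
                "description": "Major version to filter by (e.g. 7, 8). Defaults to latest",
            }
        },
        # No "required" entry, so the model may omit major_version.
        "additionalProperties": False,
    },
    on_invoke_tool=invoke,
    strict_json_schema=False,  # assumed field name; omit if your SDK version lacks it
)
```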
|
@rm-openai thanks for the workaround! Since other genai frameworks support optional parameters (via supplying defaults), is it possible to convert this to a feature request? I think that, even though we can work around this for now, it would be more elegant to be able to express the function naturally, and that would also make it easier to port code from other frameworks. WDYT? |
Yes, makes sense. I think the most sensible way to do this is by adding a `strict_mode` flag to `@function_tool`. |
Hey @rm-openai, I’d love to work on implementing this feature. Would you be open to a PR adding `strict_mode` support? |
Sure thing. Should be straightforward: add a `strict_mode` parameter to `@function_tool` and pass it through to `function_schema` and `FunctionTool`. |
heh cool, yeah I had a WIP but go for it @Jai0401, I'll help you review. Edge cases are multiple optional parameters of different types (see the test sketch below). Also independently test the things you need to change to support this (e.g. in function_schema.py). Have fun! |
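For the edge cases above, a test might look roughly like this. This is a sketch only: the `agents.function_schema` import path, the `strict_json_schema` keyword, and the `params_json_schema` attribute are assumptions about the SDK internals.

```python
from agents.function_schema import function_schema  # assumed import path


def lookup(index: str, major_version: int | None = None, fuzzy: bool = False) -> str:
    """Hypothetical tool with one required and two optional parameters."""
    return index


def test_optional_params_not_required_when_non_strict():
    # Assumed: function_schema accepts strict_json_schema and exposes the
    # generated JSON schema as params_json_schema.
    schema = function_schema(lookup, strict_json_schema=False).params_json_schema
    required = schema.get("required", [])
    assert "index" in required
    assert "major_version" not in required
    assert "fuzzy" not in required
```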
@codefromthecrypt - I realized I misread the original schema. This should actually work out of the box with the current SDK. Did you actually run into an issue? |
@rm-openai yep. My original example in the description creates the following API call. @Jai0401 maybe you can add a unit test about the generated schema in your PR, though the tests you have prove the same point, I think. So in the current code, a follow-up question that omits the version fails, because the request looks like this:

```json
{
"messages": [
{
"role": "user",
"content": "What is the latest version of Elasticsearch?"
}
],
"model": "qwen2.5:0.5b",
"stream": false,
"temperature": 0,
"tools": [
{
"type": "function",
"function": {
"name": "get_latest_elasticsearch_version",
"description": "Returns the latest GA version of Elasticsearch in \"X.Y.Z\" format.",
"parameters": {
"properties": {
"major_version": {
"anyOf": [
{
"type": "integer"
},
{
"type": "null"
}
],
"description": "Major version to filter by (e.g. 7, 8). Defaults to latest",
"title": "Major Version"
}
},
"required": [
"major_version"
],
"title": "get_latest_elasticsearch_version_args",
"type": "object",
"additionalProperties": false
}
}
}
]
}
```

I'm expecting something more like other frameworks, where an optional parameter either gets unwrapped to its underlying type and isn't required (ideal IMHO), or, in the worst case, stays defined as an optional but isn't marked required. For example, here is Pydantic AI's interpretation of the same signature and docstring:

```json
{
"messages": [
{
"role": "user",
"content": "What is the latest version of Elasticsearch 8?"
}
],
"model": "qwen2.5:3b",
"n": 1,
"stream": false,
"temperature": 0,
"tool_choice": "auto",
"tools": [
{
"type": "function",
"function": {
"name": "get_latest_elasticsearch_version",
"description": "Returns the latest GA version of Elasticsearch in \"X.Y.Z\" format.",
"parameters": {
"properties": {
"major_version": {
"anyOf": [
{
"type": "integer"
},
{
"type": "null"
}
],
"description": "Major version to filter by (e.g. 7, 8). Defaults to latest",
"title": "Major Version"
}
},
"type": "object",
"additionalProperties": false
}
}
}
]
}
```

Make sense? |
Is there any reason you'd prefer not to use strict mode? The LLM can indeed pass null, as you noted. And strict mode basically guarantees valid JSON, which is a big deal. |
The problem is that the LLM isn't passing null. Maybe I worded that incorrectly. This only works if I pass a version in my question; if I don't, the model stops and asks me to supply one. |
@codefromthecrypt I think documentation would help there, i.e. one of: |
Personally, I don't think most functions will describe themselves like this, and I don't have to do this in other frameworks either. An Optional parameter, especially one with a default, should semantically be optional from the perspective of the LLM. I think you've come to disagree with this conceptually, and I've not been able to convince you to accept non-required parameters based on the signature. If that's the case, close this out as won't-fix, and meanwhile thanks for listening. |
FYI, while I haven't gotten this to work, the suggestion is to not represent optional parameters in the JSON schema of the tool call, and instead use prompt engineering (for lack of a better term) to describe how to pass the zero value for the appropriate type. What this implies is a hybrid, similar to how certain frameworks support LLMs that don't support tools by templating the function descriptions into the prompt. What we would have to do here is, for each optional parameter, mechanically determine and evaluate a way to simulate the parameter not being marked required. If someone comes up with a general-purpose template for this, it is possible we can post-process the schema (a rough sketch of that idea follows). Also, if we can hook into the schema generation or replace it, a 3rd-party lib could do the same. @Jai0401 if you end up with ideas on this let me know; meanwhile, if I do, I will. OTOH, this is unexpected heavy lifting for me, so not sure I will. |
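As a rough illustration of the post-processing idea (a plain-Python sketch, not part of the SDK): walk the generated params schema and drop any nullable property from its `required` list, so the model may omit it.

```python
from typing import Any


def relax_optional_params(schema: dict[str, Any]) -> dict[str, Any]:
    """Remove nullable (anyOf-with-null) properties from the schema's "required" list."""

    def is_nullable(prop: dict[str, Any]) -> bool:
        return any(opt.get("type") == "null" for opt in prop.get("anyOf", []))

    props = schema.get("properties", {})
    schema["required"] = [
        name for name in schema.get("required", []) if not is_nullable(props.get(name, {}))
    ]
    return schema
```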
This PR introduces a `strict_mode: bool = True` option to `@function_tool`, allowing optional parameters when set to False. This change enables more flexibility while maintaining strict JSON schema validation by default. Resolves #43

## Changes:
- Added `strict_mode` parameter to `@function_tool` and passed it to `function_schema` and `FunctionTool`.
- Updated `function_schema.py` to respect `strict_mode` and allow optional parameters when set to False.
- Added unit tests to verify optional parameters work correctly, including multiple optional params with different types.

## Tests:
- Verified function calls with missing optional parameters behave as expected.
- Added async tests to validate behavior under different configurations.
Thanks for progressing this @rm-openai. It now works! I also verified the behavior if you take off `strict_mode=False`. All good follow-up interest, but nothing required. cc @Jai0401

Here's my function:

```python
@function_tool(strict_mode=False)
def get_latest_elasticsearch_version(major_version: int | None = None) -> str:
    """Returns the latest GA version of Elasticsearch in "X.Y.Z" format.

    Args:
        major_version: Major version to filter by (e.g. 7, 8). Defaults to latest
    """
    return "8.17.3"
```

Now, when I use it, mitmproxy shows this JSON, in which the parameter is not required but is still defined literally as an optional (nullable) type:
```json
{
"messages": [
{
"role": "user",
"content": "What is the latest version of Elasticsearch?"
}
],
"model": "qwen2.5:3b",
"stream": false,
"temperature": 0,
"tools": [
{
"type": "function",
"function": {
"name": "get_latest_elasticsearch_version",
"description": "Returns the latest GA version of Elasticsearch in \"X.Y.Z\" format.",
"parameters": {
"properties": {
"major_version": {
"anyOf": [
{
"type": "integer"
},
{
"type": "null"
}
],
"default": null,
"description": "Major version to filter by (e.g. 7, 8). Defaults to latest",
"title": "Major Version"
}
},
"title": "get_latest_elasticsearch_version_args",
"type": "object"
}
}
}
]
}
```

Now, I have to use qwen2.5:3b, not qwen2.5:0.5b, because the smaller model doesn't interpret this as a tool call. Diff of the two responses:

```
2c2
< "id": "chatcmpl-823",
---
> "id": "chatcmpl-443",
4,5c4,5
< "created": 1742206965,
< "model": "qwen2.5:0.5b",
---
> "created": 1742207060,
> "model": "qwen2.5:3b",
12c12,23
< "content": "<tool_call>\n{\"name\": \"get_latest_elasticsearch_version\", \"arguments\": null}\n</tool_call>"
---
> "content": "",
> "tool_calls": [
> {
> "id": "call_iu2d5rg9",
> "index": 0,
> "type": "function",
> "function": {
> "name": "get_latest_elasticsearch_version",
> "arguments": "{}"
> }
> }
> ]
14c25
< "finish_reason": "stop"
---
> "finish_reason": "tool_calls" Aside, but what I was originally hoping for was that if the parameter is an optional of an int, it could be translated to a non-required int. The same way as if it were defined like this: @function_tool(strict_mode=False)
def get_latest_elasticsearch_version(major_version: int = 0) -> str:
    ...
```

which produces a request like this:

```json
{
"messages": [
{
"role": "user",
"content": "What is the latest version of Elasticsearch?"
}
],
"model": "qwen2.5:3b",
"stream": false,
"temperature": 0,
"tools": [
{
"type": "function",
"function": {
"name": "get_latest_elasticsearch_version",
"description": "Returns the latest GA version of Elasticsearch in \"X.Y.Z\" format.",
"parameters": {
"properties": {
"major_version": {
"description": "Major version to filter by (e.g. 7, 8). Defaults to latest",
"title": "Major Version",
"type": "integer"
}
},
"title": "get_latest_elasticsearch_version_args",
"type": "object"
}
}
}
]
}
```

There is an improvement here, as if you use too small a model, e.g. |
@codefromthecrypt - Sorry, would you mind explaining a bit more what's missing? As far as I can tell, if you mark it as |
TL;DR: if it makes sense to treat `int | None = None` the same as `int = 0`, let's do it (see the sketch below). Regardless, all good. |
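A sketch of what that could mean at the schema level (illustrative only, not SDK code): unwrap an `anyOf`-with-`null` property to its inner type and leave it out of `required`, so `int | None = None` ends up looking like a plain, non-required integer, just as `int = 0` does.

```python
from typing import Any


def unwrap_nullable(schema: dict[str, Any]) -> dict[str, Any]:
    """Rewrite anyOf-with-null properties to their inner type and un-require them."""
    required = set(schema.get("required", []))
    for name, prop in schema.get("properties", {}).items():
        options = prop.get("anyOf", [])
        non_null = [opt for opt in options if opt.get("type") != "null"]
        if len(options) == 2 and len(non_null) == 1:
            prop.pop("anyOf")
            prop.update(non_null[0])  # e.g. {"type": "integer"}
            required.discard(name)
    if required:
        schema["required"] = sorted(required)
    else:
        schema.pop("required", None)
    return schema
```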
This issue is stale because it has been open for 7 days with no activity. |
This was referenced and resolved by openai#60, which implements the `strict_mode: bool = True` option described in the PR summary above.
This issue was closed because it has been inactive for 3 days since being marked as stale. |
@rm-openai What if I want some parameters to be required and others to be optional? I can easily do this via Pydantic, where any field that I have not wrapped in Optional will be required. Is there a way to replicate that here? Because, as far as I understand, if I set `strict_mode=False`, there's a chance, however low, that the LLM might not pass anything, and I want to avoid that. Secondly, suppose my tool calls an API linked to the backend, and I want to pass an auth token to that API. I currently do this by adding it as a parameter to each tool and asking the agent to provide it through its system prompt. Is there a better way to do it? Edit: I just checked, there's something called |
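A minimal sketch of mixing required and optional parameters under `strict_mode=False`. The function and parameter names here are made up, and the required/optional split assumes the Pydantic-default behavior discussed above: parameters without defaults stay required, defaulted ones become optional.

```python
from agents import function_tool


@function_tool(strict_mode=False)
def search_orders(customer_id: str, limit: int = 10, status: str | None = None) -> str:
    """Search orders for a customer.

    Args:
        customer_id: Required customer identifier.
        limit: Max results to return. Defaults to 10.
        status: Optional status filter, e.g. "open" or "closed".
    """
    # customer_id has no default, so it should remain in "required";
    # limit and status have defaults, so the model may omit them.
    return f"{customer_id}:{limit}:{status}"
```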
I'm struggling to find a way to declare a function which has an optional parameter. Ideally, I want something like this. Let me know the right way, if you don't mind!
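Based on the example shared later in the thread, the declaration being asked about looks roughly like this (a reconstruction, not the verbatim snippet from the issue; under the default strict mode, `major_version` ends up required):

```python
from agents import function_tool


@function_tool
def get_latest_elasticsearch_version(major_version: int | None = None) -> str:
    """Returns the latest GA version of Elasticsearch in "X.Y.Z" format.

    Args:
        major_version: Major version to filter by (e.g. 7, 8). Defaults to latest
    """
    return "8.17.3"
```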