-
Notifications
You must be signed in to change notification settings - Fork 932
New issue
Have a question about this project? Sign up for a free GitHub account to open an issue and contact its maintainers and the community.
By clicking “Sign up for GitHub”, you agree to our terms of service and privacy statement. We’ll occasionally send you account related emails.
Already on GitHub? Sign in to your account
Support for Passing Contextual Parameters in Function Calling #864
Comments
@miltonhit wouldn't it work to have the Can you elaborate more on the use cases where this functionality is important to have? |
Yes, it would be... However, currently these parameters are populated by an LLM. This isn't secure. var chatResponseFlux = chatClient.prompt()
.user(message.getContent())
.advisors(a -> a
.param('userId', userId)
)
.advisors(new ChatMemoryAdvisor(... ))
.stream().chatResponse(); |
I have also encountered a similar problem. I hope to obtain parameters in the context of the request or through the parameters of the function, instead of obtaining the session ID in the LLM response text, because the session ID in the LLM response text may be empty |
This commit adds support for tool context in various chat options classes across different AI model implementations and enhances function calling capabilities. The tool context allows passing additional contextual information to function callbacks. - Add toolContext field to chat options classes - Update builder classes to support setting toolContext - Enhance FunctionCallback interface to support context-aware function calls - Update AbstractFunctionCallback to implement BiFunction instead of Function - Modify FunctionCallbackWrapper to support both Function and BiFunction and to use the new SchemaType location - Add support for BiFunction in TypeResolverHelper - Update ChatClient interface and DefaultChatClient implementation to support new function calling methods with Function, BiFunction and FunctionCallback arguments - Refactor AbstractToolCallSupport to pass tool context to function execution - Update all affected <Model>ChatOptions with tool context support - Simplify OpenAiChatClientMultipleFunctionCallsIT test - Add tests for function calling with tool context - Add new test cases for function callbacks with context in various integration tests - Modify existing tests to incorporate new context-aware function calling capabilities Resolves spring-projects#864, spring-projects#1303, spring-projects#991
Expected Behavior
I need to get some "session" information inside a
Function<A, B>
(function call) bean, for example:Then
Today this behavior is not possible. A possible workaround is to input this param to LLM context.
But this solution is very insecure, beacase the user can propmt some injection, for example:
The text was updated successfully, but these errors were encountered: