There are a number of use cases where you'd want LLM tool calls to NOT be executed locally but instead returned to the client to trigger a client-side action. For example:

- implementing "callbacks" across service boundaries
Expected Behavior
I was thinking about adding a method to `FunctionCallback` that looks like this:
```java
public interface FunctionCallback {

	/**
	 * @return whether this function is a client-executable function. Client-executable
	 * functions are not handled locally but returned to the client with the response.
	 */
	default boolean isClientFunction() {
		return false;
	}

}
```
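A client-executable callback would then simply override the flag. A hypothetical example - the other `FunctionCallback` methods are sketched from memory and may not match the interface exactly:

```java
// Hypothetical example: a callback that opts out of local execution.
public class FrontendPaymentCallback implements FunctionCallback {

	@Override
	public String getName() {
		return "paymentStatus";
	}

	@Override
	public String getDescription() {
		return "Get the status of a payment transaction";
	}

	@Override
	public String getInputTypeSchema() {
		return "{\"type\":\"object\",\"properties\":{\"id\":{\"type\":\"string\"}}}";
	}

	@Override
	public String call(String functionInput) {
		// Never reached for client-executable functions.
		throw new UnsupportedOperationException("Executed on the client");
	}

	@Override
	public boolean isClientFunction() {
		return true;
	}

}
```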
The `isToolCall` method in `AbstractToolCallSupport` would then check this flag for each requested tool call and return `false` for client-executable functions. That would bypass the local tool execution and return the tool call as part of the response, so that the user can forward it to the frontend as needed.
The rest of the tooling implementation could stay the same and the current work on extending function calling support would all carry over.
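To make the idea concrete, here is a rough sketch of the adjusted check, written as a fragment of the class. The field and method names approximate `AbstractToolCallSupport` and may not match the actual implementation:

```java
// Sketch only: signatures and the callback registry field are assumptions
// based on the description above, not the real AbstractToolCallSupport code.
protected boolean isToolCall(ChatResponse chatResponse, Set<String> toolCallFinishReasons) {
	Generation generation = chatResponse.getResult();
	if (generation == null
			|| !toolCallFinishReasons.contains(generation.getMetadata().getFinishReason())) {
		return false;
	}
	// New part: if any requested function is client-executable, report "no tool call"
	// so local execution is bypassed and the tool call flows back with the response.
	return generation.getOutput().getToolCalls().stream()
		.map(toolCall -> this.functionCallbackRegister.get(toolCall.name()))
		.noneMatch(callback -> callback != null && callback.isClientFunction());
}
```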
Current Behavior
I couldn't find a way to NOT invoke registered tools immediately. The only workaround I found for passing tool calls to the frontend is to throw an exception from the function execution that carries the payload, and then catch it at the call site. But that approach has a lot of downsides - not to mention being very hacky ;-)
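Roughly, the hack looks like this (all names here are illustrative, not Spring AI API, and it assumes the exception propagates out of the chat call unwrapped):

```java
// Illustrative only: a custom exception that smuggles the tool-call payload
// out of the local function execution.
class ClientToolCallException extends RuntimeException {

	final String toolName;
	final String argumentsJson;

	ClientToolCallException(String toolName, String argumentsJson) {
		this.toolName = toolName;
		this.argumentsJson = argumentsJson;
	}

}

// The registered function throws instead of executing, and the call site
// catches the exception and forwards the payload:
try {
	ChatResponse response = chatModel.call(prompt);
	// normal handling
}
catch (ClientToolCallException ex) {
	// send ex.toolName / ex.argumentsJson to the frontend for client-side execution
}
```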
Context
See above. Let me know if there is a workaround. If not, I'm happy to contribute the described change - or let me know if there is a better way.
@mbroecheler I believe commit 5017749 resolves this issue.
Please check OpenAiChatModelProxyToolCallsIT.java for examples of how to run the function calling entirely on the client side.
Let me know if you have further questions.
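Roughly, the usage looks like this - a minimal sketch, where the `paymentStatus` function name and the prompt are illustrative and the IT above is the authoritative reference:

```java
// A minimal sketch, assuming the proxyToolCalls option introduced by that commit.
var options = OpenAiChatOptions.builder()
	.withFunction("paymentStatus")
	.withProxyToolCalls(true) // skip local execution; hand tool calls back to the caller
	.build();

ChatResponse response = chatModel.call(
		new Prompt("What is the status of payment 42?", options));

// With proxying enabled, the assistant message carries the raw tool calls,
// which the application can forward to its own client for execution.
var toolCalls = response.getResult().getOutput().getToolCalls();
```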
@tzolov is there a possibility to have some tool calls proxied and others handled directly?