
Add tool context support to chat options and enhance function calling #1458

Closed · wants to merge 1 commit

Conversation

tzolov
Contributor

@tzolov tzolov commented Oct 4, 2024

This commit adds support for tool context in various chat options classes across different AI model implementations and enhances function calling capabilities.

The tool context allows passing additional contextual information to function callbacks.

  • Add toolContext field to chat options classes
  • Update builder classes to support setting toolContext
  • Enhance FunctionCallback interface to support context-aware function calls
  • Update AbstractFunctionCallback to implement BiFunction instead of Function
  • Modify FunctionCallbackWrapper to support both Function and BiFunction and to use the new SchemaType location
  • Add support for BiFunction in TypeResolverHelper
  • Update ChatClient interface and DefaultChatClient implementation to support new function calling methods with Function, BiFunction and FunctionCallback arguments
  • Refactor AbstractToolCallSupport to pass tool context to function execution
  • Update all affected ChatOptions with tool context support
  • Simplify OpenAiChatClientMultipleFunctionCallsIT test
  • Add tests for function calling with tool context
  • Add new test cases for function callbacks with context in various integration tests
  • Modify existing tests to incorporate new context-aware function calling capabilities

Resolves #864, #1303, #991
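The core change in the bullet list above — callbacks taking a context argument alongside the request — boils down to moving from `Function` to `BiFunction`. A minimal sketch of that shape, with no Spring AI dependencies and hypothetical record names, just to show how the tool context flows into the callback:

```java
import java.util.Map;
import java.util.function.BiFunction;

// Sketch only: WeatherRequest/WeatherResponse are made-up stand-ins,
// not Spring AI types. The point is the (request, context) signature.
public class ToolContextSketch {

    record WeatherRequest(String location) {}
    record WeatherResponse(double temperature, String requestedBy) {}

    public static void main(String[] args) {
        // A context-aware callback is a BiFunction over (request, context)
        // instead of a plain Function over the request alone.
        BiFunction<WeatherRequest, Map<String, Object>, WeatherResponse> callback =
            (request, toolContext) -> {
                // Contextual values travel outside the model-generated arguments.
                String userId = (String) toolContext.get("userId");
                double temperature = request.location().contains("Paris") ? 15 : 10;
                return new WeatherResponse(temperature, userId);
            };

        WeatherResponse response =
            callback.apply(new WeatherRequest("Paris"), Map.of("userId", "user456"));

        System.out.println(response.temperature() + " " + response.requestedBy());
    }
}
```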

@tzolov tzolov added this to the 1.0.0-M3 milestone Oct 4, 2024
@markpollack
Member

I added this to openai-chat-functions.adoc for now

==== How to Use Tool Context

You can set the tool context when building your chat options and use a BiFunction for your callback:

[source,java]
----
BiFunction<MockWeatherService.Request, Map<String, Object>, MockWeatherService.Response> weatherFunction = 
    (request, toolContext) -> {
        String sessionId = (String) toolContext.get("sessionId");
        String userId = (String) toolContext.get("userId");
        
        // Use sessionId and userId in your function logic
        double temperature = 0;
        if (request.location().contains("Paris")) {
            temperature = 15;
        }
        else if (request.location().contains("Tokyo")) {
            temperature = 10;
        }
        else if (request.location().contains("San Francisco")) {
            temperature = 30;
        }

        return new MockWeatherService.Response(temperature, 15, 20, 2, 53, 45, MockWeatherService.Unit.C);
    };

OpenAiChatOptions options = OpenAiChatOptions.builder()
    .withModel(OpenAiApi.ChatModel.GPT_4_O.getValue())
    .withFunctionCallbacks(List.of(FunctionCallbackWrapper.builder(weatherFunction)
        .withName("getCurrentWeather")
        .withDescription("Get the weather in location")
        .build()))
    .withToolContext(Map.of("sessionId", "123", "userId", "user456"))
    .build();
----

In this example, the `weatherFunction` is defined as a BiFunction that takes both the request and the tool context as parameters. This allows you to access the context directly within the function logic.

You can then use these options when making a call to the chat model:

[source,java]
----
UserMessage userMessage = new UserMessage("What's the weather like in San Francisco, Tokyo, and Paris?");
ChatResponse response = chatModel.call(new Prompt(List.of(userMessage), options));
----
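The PR also notes that `FunctionCallbackWrapper` accepts both `Function` and `BiFunction` callbacks. One way that dual support can work (a sketch with a hypothetical `lift` helper, not the actual Spring AI implementation) is to adapt a context-unaware `Function` into the `BiFunction` shape by ignoring the context:

```java
import java.util.Map;
import java.util.function.BiFunction;
import java.util.function.Function;

// Sketch only: "lift" is a made-up helper illustrating how a wrapper can
// treat plain Functions and context-aware BiFunctions uniformly.
public class CallbackAdapterSketch {

    static <I, O> BiFunction<I, Map<String, Object>, O> lift(Function<I, O> fn) {
        return (input, toolContext) -> fn.apply(input); // context is ignored
    }

    public static void main(String[] args) {
        Function<String, Integer> length = String::length;
        BiFunction<String, Map<String, Object>, Integer> lifted = lift(length);
        // The lifted callback accepts a tool context but behaves as before.
        System.out.println(lifted.apply("Paris", Map.of("sessionId", "123")));
    }
}
```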

@markpollack
Member

Merged in 9c10a08.

Successfully merging this pull request may close these issues.

Support for Passing Contextual Parameters in Function Calling