Description
There seems to be a difference in how function arguments are serialized between SemanticKernel's KernelFunctions and an AIFunction used as a KernelFunction (via AsKernelFunction()).
When a complex type like
public record TodoListItems(string[] Items);
is used as a function argument, the function ends up being called with a JSON string representation of that object.
The JSON schema, however, seems to be correct for both:
{
  "type": "object",
  "properties": {
    "values": {
      "type": "object",
      "properties": {
        "items": {
          "type": "array",
          "items": {
            "type": "string"
          }
        }
      },
      "required": [
        "items"
      ]
    },
    "scope": {
      "type": "string"
    }
  },
  "required": [
    "values",
    "scope"
  ]
}
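Both schemas can be compared directly without an LLM involved. A minimal sketch (it reuses the same signature as the repro below, names are mine) that dumps what each wrapper exposes:

// Sketch: compare the JSON schema both factories produce for the same signature.
var skFunction = KernelFunctionFactory.CreateFromMethod(
    (TodoListItems values, string scope) => "success",
    new() { FunctionName = "add_todos" });

var aiFunction = AIFunctionFactory.Create(
    (TodoListItems values, string scope) => "success",
    new() { Name = "add_todos" });

Console.WriteLine(skFunction.AsAIFunction().JsonSchema.GetRawText());
Console.WriteLine(aiFunction.JsonSchema.GetRawText());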
Reproduction Steps
Let SemanticKernel via AzureOpenAI call these two methods:
// this one works
KernelFunctionFactory.CreateFromMethod(
    (TodoListItems items, string scope) => items.Items.Reverse().ToArray(),
    new() { FunctionName = "add_todos" }),

// this one fails
AIFunctionFactory.Create(
    (TodoListItems values, string scope) => "success",
    new()
    {
        Name = "add_special_todos",
        Description = "add todos, but only if the user explicitly asks for special todos"
    }).AsKernelFunction()
If you ask to "add some random todos", it will call the delegate.
If you ask to "add some special random todos", it will throw an error.
To debug the AIFunction, I wrapped it:
using System.Reflection;
using System.Text.Json;
using Microsoft.Extensions.AI;

public class DebugAiFunction(AIFunction inner) : AIFunction
{
    public override string Name => inner.Name;
    public override IReadOnlyDictionary<string, object?> AdditionalProperties => inner.AdditionalProperties;
    public override string Description => inner.Description;
    public override JsonElement JsonSchema => inner.JsonSchema;
    public override JsonSerializerOptions JsonSerializerOptions => inner.JsonSerializerOptions;
    public override JsonElement? ReturnJsonSchema => inner.ReturnJsonSchema;
    public override MethodInfo? UnderlyingMethod => inner.UnderlyingMethod;

    protected override ValueTask<object?> InvokeCoreAsync(AIFunctionArguments arguments, CancellationToken cancellationToken)
    {
        // Breakpoint/logging spot: `arguments` contains exactly what was passed in.
        return inner.InvokeAsync(arguments, cancellationToken);
    }
}
But SemanticKernel's KernelFunction implementation is internal, so the same approach didn't work there; instead I made a Matryoshka:
new DebugAiFunction(
    KernelFunctionFactory.CreateFromMethod(
        (TodoListItems items, string scope) => items.Items.Reverse().ToArray(),
        new() { FunctionName = "add_todos" })
    .AsAIFunction())
    .AsKernelFunction(),
Both debug functions see exactly the same argument (as a literal string):
{"Items": ["Buy groceries", "Call mom", "Finish reading book"]}
Expected behavior
The same behavior in a KernelFunction and an AIFunction.
Actual behavior
System.ArgumentException: 'Object of type 'System.String' cannot be converted to type
Regression?
No response
Known Workarounds
Writing a strongly typed plugin in SemanticKernel to wrap the MCP client (see the sketch below), but then... there goes the flexibility of MCP.
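Roughly what that workaround looks like, as a sketch only: the plugin class, the callTool delegate, and the tool name are placeholders standing in for the real MCP client call, not an API of the ModelContextProtocol-SemanticKernel package.

using System.ComponentModel;
using Microsoft.SemanticKernel;

// Sketch of the strongly typed workaround. `callTool` is a placeholder for
// however the MCP tool actually gets invoked; SemanticKernel handles the
// TodoListItems binding because the parameter is strongly typed.
public sealed class TodoPlugin(Func<string, IReadOnlyDictionary<string, object?>, Task<string>> callTool)
{
    [KernelFunction("add_todos")]
    [Description("Adds the given todo items.")]
    public Task<string> AddTodosAsync(TodoListItems values, string scope)
        => callTool("add_todos", new Dictionary<string, object?>
        {
            ["values"] = values,
            ["scope"] = scope
        });
}

// Registration:
// kernel.Plugins.AddFromObject(new TodoPlugin(callTool));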
Configuration
<TargetFramework>net9.0</TargetFramework>
<PackageReference Include="ModelContextProtocol-SemanticKernel" Version="0.3.0-preview-01" />
<PackageReference Include="Microsoft.SemanticKernel.Connectors.AzureOpenAI" Version="1.58.0" />
Other information
The whole pipeline that went wrong here was something along the lines of:
With the MCP server expecting an array of arguments.