Support function call for llama-2 #174
Replies: 3 comments 3 replies
-
I believe llama.cpp should work with this as is, so as long as you have a GGUF model the currently released LLamaSharp should be able to use it. Reading the model file, it was trained to produce JSON for commands when run, which you would need to handle in the output. If I understand it right, this won't "run" the functions (e.g. it is not going to call Bing if you set up a function to do so; it will give you a JSON response requesting a call to a function that you wrote, for you to handle and then give back the results). This will work with the current llama.cpp/LLamaSharp. LLamaSharp could potentially handle those JSON requests and do the work for the user (provided they set up the code for the functions ahead of time), but that's what semantic-kernel already does, and we already added an implementation for it, which should work with most Llama 2 models already available.
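To illustrate what "handle the JSON in the output" means in practice, here is a minimal sketch of the parsing step. The `{"function_call": {"name": ..., "arguments": ...}}` shape is a hypothetical example; the real schema depends on how the particular model was fine-tuned:

```csharp
using System;
using System.Text.Json;

// Hypothetical model output; the exact JSON schema depends on the model's fine-tuning.
string modelOutput = "{\"function_call\": {\"name\": \"search_bing\", \"arguments\": \"{\\\"query\\\": \\\"weather\\\"}\"}}";

string functionName = null;
string functionArgs = null;

using (var doc = JsonDocument.Parse(modelOutput))
{
    // The model does not run anything itself: we only extract the requested
    // function name and arguments and hand them to user-registered code.
    if (doc.RootElement.TryGetProperty("function_call", out var call))
    {
        functionName = call.GetProperty("name").GetString();
        functionArgs = call.GetProperty("arguments").GetString();
    }
}

Console.WriteLine($"{functionName} => {functionArgs}");
```

The important point is that the dispatch step, calling the real `search_bing` (or whatever) with those arguments and feeding the result back into the conversation, is entirely the caller's responsibility.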
-
This is how we use the ChatGPT function call:

```csharp
var response = await client.GetChatCompletionsAsync(deploymentModel, chatCompletionsOptions);
var choice = response.Value.Choices[0];
var message = choice.Message;

_logger.LogInformation($"Token Usage: {response.Value.Usage.PromptTokens} prompt + {response.Value.Usage.CompletionTokens} completion = {response.Value.Usage.TotalTokens} total");

if (choice.FinishReason == CompletionsFinishReason.FunctionCall)
{
    _logger.LogInformation($"[{agent.Name}]: {message.FunctionCall.Name} => {message.FunctionCall.Arguments}");
    var funcContextIn = new RoleDialogModel(AgentRole.Function, message.Content)
    {
        CurrentAgentId = agent.Id,
        FunctionName = message.FunctionCall.Name,
        FunctionArgs = message.FunctionCall.Arguments
    };

    // Sometimes the LLM will prefix the function name with the agent name.
    if (!string.IsNullOrEmpty(funcContextIn.FunctionName))
    {
        funcContextIn.FunctionName = funcContextIn.FunctionName.Split('.').Last();
    }

    // Execute functions
    await onFunctionExecuting(funcContextIn);
}
else
{
    _logger.LogInformation($"[{agent.Name}] {message.Role}: {message.Content}");
    var msg = new RoleDialogModel(AgentRole.Assistant, message.Content)
    {
        CurrentAgentId = agent.Id
    };

    // Text response received
    await onMessageReceived(msg);
}
```

This is how we use LLamaSharp:

```csharp
string totalResponse = "";
var prompt = agent.Instruction + content;
var convSetting = _services.GetRequiredService<ConversationSetting>();
if (convSetting.ShowVerboseLog)
{
    _logger.LogInformation(prompt);
}

foreach (var response in executor.Infer(prompt, inferenceParams))
{
    Console.Write(response);
    totalResponse += response;
}

foreach (var anti in inferenceParams.AntiPrompts)
{
    totalResponse = totalResponse.Replace(anti, "").Trim();
}

_logger.LogInformation($"[{agent.Name}] {AgentRole.Assistant}: {totalResponse}");

var msg = new RoleDialogModel(AgentRole.Assistant, totalResponse)
{
    CurrentAgentId = agent.Id
};

// There is no function call support yet; it needs to be implemented in the LLamaSharp library.
// Otherwise we need to parse the content and simulate the function call.
// Text response received
await onMessageReceived(msg);
```

As you can see, function calls are not supported yet. If the LLamaSharp library, or whichever component is specifically in charge of this feature, does not provide a function call implementation, we could add something similar ourselves.
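Until the library supports it, the "parse the content and simulate the function call" idea could look roughly like this. This is only a sketch under the assumption that the model embeds a JSON object with `name` and `arguments` fields somewhere in its text response; the real format depends on the model:

```csharp
using System;
using System.Text.Json;

// Sketch only: assumes the model embeds a {"name": ..., "arguments": ...} JSON
// object in its text output. Adjust to whatever format the model actually emits.
static bool TryParseFunctionCall(string response, out string name, out string args)
{
    name = null;
    args = null;
    int start = response.IndexOf('{');
    int end = response.LastIndexOf('}');
    if (start < 0 || end <= start) return false; // no JSON object in the text

    try
    {
        using var doc = JsonDocument.Parse(response.Substring(start, end - start + 1));
        if (!doc.RootElement.TryGetProperty("name", out var n)) return false;
        name = n.GetString();
        if (doc.RootElement.TryGetProperty("arguments", out var a))
            args = a.GetRawText();
        return true;
    }
    catch (JsonException)
    {
        return false; // the braces did not contain valid JSON; treat as plain text
    }
}

string text = "Sure, let me look that up. {\"name\": \"get_weather\", \"arguments\": {\"city\": \"Paris\"}}";
if (TryParseFunctionCall(text, out var fnName, out var fnArgs))
{
    // This is where the LLamaSharp branch could build a RoleDialogModel with
    // AgentRole.Function and call onFunctionExecuting, instead of treating
    // totalResponse as a plain assistant message.
    Console.WriteLine($"{fnName} => {fnArgs}");
}
```

On success, the caller would take the `AgentRole.Function` path from the ChatGPT snippet; on failure, fall through to the existing plain-text handling.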
-
I'd love to use function calls! I think if we're going to support them, we should split the work into two parts:
That way if another function call syntax became popular LLamaSharp would have all the necessary "hooks" implemented to support it right away. |
-
Is it possible to add a function call feature to LLamaSharp?
fLlama 2 - Function Calling Llama 2