Description
For example, claude-2 does not support function calling natively, but I want to use Claude to solve tasks via function calls.
I have tried one method (below), but it does not work well. Is there any other approach?
===========【my method steps】===========
###1. Use litellm to set up a proxy so that claude-2 acts as gpt (supported by litellm).
###2. Use litellm to adapt the function config so it acts like gpt (already supported by litellm via the add_function_to_prompt parameter):
litellm --model claude-2 --port 7003 -f --add_function_to_prompt
###3. Use autogen to execute the function-call task (a sketch of this step is shown below).
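For reference, here is a minimal sketch of how step 3 can look on the autogen side, assuming the litellm proxy from step 2 is running at http://localhost:7003. The get_current_weather schema, the dummy implementation, and the placeholder api_key are illustrative assumptions rather than my exact setup, and depending on your autogen/openai versions the endpoint key may be base_url instead of api_base:

```python
import autogen

# Point autogen at the local litellm proxy that fronts claude-2 (port from step 2).
# NOTE: "api_base" is the key used by older pyautogen/openai versions; newer ones use "base_url".
config_list = [
    {
        "model": "claude-2",
        "api_base": "http://localhost:7003",  # litellm proxy endpoint (assumption)
        "api_key": "NULL",                    # placeholder; the proxy holds the real Anthropic key
    }
]

# Advertise the function schema to the model through llm_config.
llm_config = {
    "config_list": config_list,
    "functions": [
        {
            "name": "get_current_weather",
            "description": "Get the current weather in a given location",
            "parameters": {
                "type": "object",
                "properties": {
                    "location": {"type": "string", "description": "City and state, e.g. Boston, MA"},
                    "unit": {"type": "string", "enum": ["celsius", "fahrenheit"]},
                },
                "required": ["location"],
            },
        }
    ],
}

def get_current_weather(location, unit="fahrenheit"):
    # Dummy implementation for illustration only.
    return f"The weather in {location} is 72 degrees {unit}."

assistant = autogen.AssistantAgent(name="assistant", llm_config=llm_config)
user_proxy = autogen.UserProxyAgent(
    name="user_proxy",
    human_input_mode="TERMINATE",
    function_map={"get_current_weather": get_current_weather},  # executed when the model emits a function_call
)

user_proxy.initiate_chat(assistant, message="What is the weather like in Boston now")
```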
================ 【exec result by claude-2】 ===================
user_proxy (to assistant):
What is the weather like in Boston now
assistant (to user_proxy):
To get the current weather in Boston, I would call the provided function get_current_weather like this:
get_current_weather({
'location': 'Boston, MA',
'unit': 'fahrenheit'
})
This would return the current weather in Boston in degrees Fahrenheit. Since I don't have access to the actual function implementation, I can't provide the exact weather conditions. But calling the function in this way would retrieve the current weather in Boston using the provided API.
Provide feedback to assistant. Press enter to skip and use auto-reply, or type 'exit' to end the conversation:
================ 【exec result by gpt, just for comparison】 ===================
user_proxy (to assistant):
What is the weather like in Boston now
assistant (to user_proxy):
*** Suggested function Call: get_current_weather *****
Arguments:
{
"location": "Boston, MA"
}
===========================【vs result】==============================
From the results we can see that the gpt model directly produces a structured function call:
*** Suggested function Call: get_current_weather *****
whereas claude-2 only writes python-style code, which autogen cannot call directly:
get_current_weather({
'location': 'Boston, MA',
'unit': 'fahrenheit'
})
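To make the gap concrete: autogen can only dispatch a function when the assistant message carries the OpenAI-style structured function_call field, while the claude-2 reply embeds the call in plain text, so there is nothing for autogen to execute. The two message shapes below are a hand-written illustrative comparison, not captured proxy output:

```python
# What autogen can execute: an OpenAI-format message carrying a structured function_call.
gpt_style_message = {
    "role": "assistant",
    "content": None,
    "function_call": {
        "name": "get_current_weather",
        "arguments": '{"location": "Boston, MA"}',
    },
}

# What claude-2 (behind the litellm proxy) returns: the call exists only as free text,
# so autogen's UserProxyAgent never sees a function_call to dispatch.
claude_style_message = {
    "role": "assistant",
    "content": (
        "To get the current weather in Boston, I would call the provided function "
        "get_current_weather like this:\n"
        "get_current_weather({'location': 'Boston, MA', 'unit': 'fahrenheit'})"
    ),
}
```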