
[Question] How to use tool calling with Gemini and DeepSeek #1391

Open

Rahul-Sindhu opened this issue Jan 3, 2025 · 1 comment
Labels
question Further information is requested

Comments


Rahul-Sindhu commented Jan 3, 2025

Required prerequisites

Questions

Simple examples of tool use with Gemini and DeepSeek fail.

In the agents_with_tools example cookbook, if I use either of the two model configurations below, the agent.step call fails:

from camel.agents import ChatAgent
from camel.configs import DeepSeekConfig, GeminiConfig
from camel.models import ModelFactory
from camel.toolkits import MathToolkit, SearchToolkit
from camel.types import ModelPlatformType, ModelType

tools_list = [
    *MathToolkit().get_tools(),
    *SearchToolkit().get_tools(),
]

# Model for Gemini (only one of the two model assignments is used per run)
model = ModelFactory.create(
    model_platform=ModelPlatformType.GEMINI,
    model_type='gemini-1.5-flash',
    model_config_dict=GeminiConfig(temperature=0.2, max_tokens=8190).as_dict(),
)

# Model for DeepSeek
model = ModelFactory.create(
    model_platform=ModelPlatformType.DEEPSEEK,
    model_type=ModelType.DEEPSEEK_CHAT,
    model_config_dict=DeepSeekConfig(temperature=1.3, tools=tools_list).as_dict(),
    api_key="sk-xx",
    url="https://api.deepseek.com",
)

# Set the system message for the assistant
assistant_sys_msg = """You are a helpful assistant to do search task."""

# Set up the agent
agent = ChatAgent(
    assistant_sys_msg,
    model=model,
    tools=tools_list,
)

agent.step('a message that calls tools')

For DeepSeek I get the error below:
File "/opt/miniconda3/lib/python3.12/site-packages/openai/_base_client.py", line 1061, in _request
raise self._make_status_error_from_response(err.response) from None
openai.UnprocessableEntityError: Failed to deserialize the JSON body into the target type: messages[4].role: unknown variant function, expected one of system, user, assistant, tool at line 1 column 19571
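
If I am reading the error right, the request contains a message with the legacy role "function", while DeepSeek's OpenAI-compatible API only accepts the roles listed in the error (system, user, assistant, tool). A rough sketch of the two shapes, written by hand for illustration (the tool name and contents are made up):

# Sketch of the mismatch suggested by the error message: DeepSeek rejects the
# legacy "function" role and expects tool results to be sent with role "tool".
rejected_tool_message = {
    "role": "function",             # legacy OpenAI role; rejected by DeepSeek
    "name": "search_google",        # hypothetical tool name, for illustration only
    "content": '{"result": "..."}',
}

expected_tool_message = {
    "role": "tool",                 # one of: system, user, assistant, tool
    "tool_call_id": "call_abc123",  # must match the id of the assistant's tool call
    "content": '{"result": "..."}',
}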

For Gemini I get the error below:
File "/opt/miniconda3/lib/python3.12/site-packages/openai/_base_client.py", line 1061, in _request
raise self._make_status_error_from_response(err.response) from None
openai.BadRequestError: Error code: 400 - [{'error': {'code': 400, 'message': 'Request contains an invalid argument.', 'status': 'INVALID_ARGUMENT'}}]
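
The Gemini message is too generic to tell which argument it objects to. To narrow it down, one thing I would try is sending a minimal tools request directly to Gemini's OpenAI-compatible endpoint, bypassing CAMEL; if that succeeds, the problem is in how the request is assembled. A diagnostic sketch (the base_url is Google's OpenAI-compatibility endpoint; the add tool is made up for the test):

# Diagnostic sketch: send a minimal tools request straight to Gemini's
# OpenAI-compatible endpoint to check whether the 400 comes from the tool
# schema itself or from how the full request is assembled.
from openai import OpenAI

client = OpenAI(
    api_key="GEMINI_API_KEY",
    base_url="https://generativelanguage.googleapis.com/v1beta/openai/",
)

response = client.chat.completions.create(
    model="gemini-1.5-flash",
    messages=[{"role": "user", "content": "What is 2 + 2?"}],
    tools=[{
        "type": "function",
        "function": {
            "name": "add",  # made-up tool for the test
            "description": "Add two numbers.",
            "parameters": {
                "type": "object",
                "properties": {"a": {"type": "number"}, "b": {"type": "number"}},
                "required": ["a", "b"],
            },
        },
    }],
)
print(response.choices[0].message)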

Rahul-Sindhu added the question label on Jan 3, 2025
@Wendong-Fan
Member

Hey @Rahul-Sindhu, thanks for raising this question and sorry for the late reply. Tool calling support for Gemini and DeepSeek hasn't been added to CAMEL yet; one related issue is here: #1238. We plan to add tool calling support for these two models this week and will let you know once it's done.
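
In the meantime, since DeepSeek exposes an OpenAI-compatible endpoint, one possible interim workaround is to call it directly with the openai SDK and run the tool yourself. This is only a minimal sketch under that assumption, not CAMEL code; the get_weather tool, its schema, and the returned values are made up for illustration:

# Interim workaround sketch: call DeepSeek's OpenAI-compatible API directly
# with the openai SDK until native tool-calling support lands in CAMEL.
import json
from openai import OpenAI

client = OpenAI(api_key="sk-xx", base_url="https://api.deepseek.com")

tools = [{
    "type": "function",
    "function": {
        "name": "get_weather",  # made-up tool for illustration
        "description": "Get the current weather for a city.",
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    },
}]

messages = [{"role": "user", "content": "What's the weather in Paris?"}]
response = client.chat.completions.create(
    model="deepseek-chat", messages=messages, tools=tools
)

# Assumes the model decided to call the tool.
tool_call = response.choices[0].message.tool_calls[0]
args = json.loads(tool_call.function.arguments)

# Run the tool yourself, then send the result back with role "tool".
messages.append(response.choices[0].message)
messages.append({
    "role": "tool",
    "tool_call_id": tool_call.id,
    "content": json.dumps({"city": args["city"], "temp_c": 12}),  # fake result
})
final = client.chat.completions.create(model="deepseek-chat", messages=messages)
print(final.choices[0].message.content)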
