Description
What happened?
Describe the bug
I am following https://microsoft.github.io/autogen/stable/user-guide/agentchat-user-guide/tutorial/models.html#gemini-experimental, but the documentation seems incomplete.
To Reproduce
from autogen_ext.models.openai import OpenAIChatCompletionClient

model_client = OpenAIChatCompletionClient(
    model="gemini-2.0-flash-lite",
    api_key="<>",
)
..._model_info.py", line 314, in get_info
raise ValueError("model_info is required when model name is not a valid OpenAI model")
ValueError: model_info is required when model name is not a valid OpenAI model
How do I get this to work with Gemini?
I am using v0.5.1.
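For reference, based on the error message it looks like model_info has to be supplied explicitly when the model name is not a built-in OpenAI model. Here is a minimal sketch of what I assume is intended (the ModelInfo import path and the capability values below are my guesses, not taken from the linked docs):

from autogen_core.models import ModelInfo
from autogen_ext.models.openai import OpenAIChatCompletionClient

model_client = OpenAIChatCompletionClient(
    model="gemini-2.0-flash-lite",
    api_key="<>",
    # Assumed capability flags for gemini-2.0-flash-lite; adjust if wrong.
    model_info=ModelInfo(
        vision=True,
        function_calling=True,
        json_output=True,
        structured_output=True,
        family="unknown",
    ),
)

Is something like this the expected usage, or should the model be recognized automatically?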
Which packages were the bug in?
Python Extensions (autogen-ext)
AutoGen library version.
Python 0.5.1
Other library version.
No response
Model used
gemini-2.0-flash-lite
Model provider
Google Gemini
Other model provider
No response
Python version
3.12
.NET version
None
Operating system
None