diff --git a/README.md b/README.md
index 43abed8fb9f0..5b7840f3ce71 100644
--- a/README.md
+++ b/README.md
@@ -72,6 +72,9 @@ user_proxy = UserProxyAgent("user_proxy", code_execution_config={"work_dir": "co
 user_proxy.initiate_chat(assistant, message="Plot a chart of NVDA and TESLA stock price change YTD.")
 # This initiates an automated chat between the two agents to solve the task
 ```
+Multi-agent conversations: AutoGen agents can communicate with each other to solve tasks. This allows for more complex and sophisticated applications than would be possible with a single LLM.
+Customization: AutoGen agents can be customized to meet the specific needs of an application. This includes the ability to choose the LLMs to use, the types of human input to allow, and the tools to employ.
+Human participation: AutoGen seamlessly allows human participation. This means that humans can provide input and feedback to the agents as needed.
 
 This example can be run with
 ```python
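
The added lines mention choosing the LLMs to use and the types of human input to allow. A minimal sketch of what that customization looks like in code, assuming AutoGen's standard `AssistantAgent`/`UserProxyAgent` API with its `llm_config` and `human_input_mode` parameters (the model name, API-key environment variable, and working directory are placeholders, not part of this diff):

```python
import os

from autogen import AssistantAgent, UserProxyAgent

# Choose which LLM(s) the assistant may use; assumes an OpenAI-style config
# entry with the key read from the environment.
llm_config = {"config_list": [{"model": "gpt-4", "api_key": os.environ["OPENAI_API_KEY"]}]}

assistant = AssistantAgent("assistant", llm_config=llm_config)

# human_input_mode controls how much the human participates:
# "ALWAYS" asks for feedback at every turn, "TERMINATE" only at the end,
# "NEVER" lets the agents run fully autonomously.
user_proxy = UserProxyAgent(
    "user_proxy",
    human_input_mode="ALWAYS",
    code_execution_config={"work_dir": "coding"},
)

# Start the automated two-agent conversation on the task.
user_proxy.initiate_chat(assistant, message="Plot a chart of NVDA and TESLA stock price change YTD.")
```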