Merge branch 'main' into selectspeakertransforms
sonichi authored Sep 13, 2024
2 parents b486d45 + af7ea9a commit 01cae91
Showing 331 changed files with 641 additions and 23,579 deletions.
11 changes: 1 addition & 10 deletions .devcontainer/README.md
@@ -29,15 +29,6 @@ These configurations can be used with Codespaces and locally.
- **Before using**: We highly encourage all potential contributors to read the [AutoGen Contributing](https://autogen-ai.github.io/autogen/docs/Contribute) page prior to submitting any pull requests.


### studio

- **Purpose**: Tailored for AutoGen project developers, this Dockerfile, i.e., `./studio/Dockerfile`, includes tools and configurations aiding in development and contribution.
- **Usage**: Recommended for developers who are contributing to the AutoGen project.
- **Building the Image**: Run `docker build -f studio/Dockerfile -t autogen_studio_img .`.
- **Using with Codespaces**: `Code > Codespaces > Click on ...> New with options > Choose "studio" as devcontainer configuration`.
- **Before using**: We highly encourage all potential contributors to read the [AutoGen Contributing](https://autogen-ai.github.io/autogen/docs/Contribute) page prior to submitting any pull requests.


## Customizing Dockerfiles

Feel free to modify these Dockerfiles for your specific project needs. Here are some common customizations:
@@ -49,7 +40,7 @@ Feel free to modify these Dockerfiles for your specific project needs. Here are
- **Setting Environment Variables**: Add environment variables using the `ENV` command for any application-specific configurations. We have pre-staged the line needed to inject your OpenAI API key into the Docker environment as an environment variable; others can be staged in the same way. Just uncomment the line.
`# ENV OPENAI_API_KEY="{OpenAI-API-Key}"` to `ENV OPENAI_API_KEY="{OpenAI-API-Key}"`
- **Need a less "Advanced" AutoGen build**: If the `./full/Dockerfile` is too much but you still need more than the base image provides, update this line in the Dockerfile.
`RUN pip install pyautogen[teachable,lmm,retrievechat,mathchat,blendsearch] autogenra` to install just what you need. `RUN pip install pyautogen[retrievechat,blendsearch] autogenra`
`RUN pip install autogen[teachable,lmm,retrievechat,mathchat,blendsearch] autogenra` to install just what you need. `RUN pip install autogen[retrievechat,blendsearch] autogenra`
- **Can't Dev without your favorite CLI tool**: if you need particular OS tools to be installed in your Docker container you can add those packages here right after the sudo for the `./base/Dockerfile` and `./full/Dockerfile` files. In the example below we are installing net-tools and vim to the environment.

```code
# (remaining lines of this example are collapsed in this diff view)
```
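For reference, a hedged sketch of the kind of line the sentence above describes (the exact collapsed content is not shown here; `net-tools` and `vim` are the packages named in the prose, and the `apt-get` form is an assumption):

```code
RUN sudo apt-get update \
    && sudo apt-get install -y net-tools vim
```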
2 changes: 1 addition & 1 deletion .devcontainer/full/Dockerfile
@@ -22,7 +22,7 @@ WORKDIR /home/autogen-ai

# Install Python packages
RUN pip install --upgrade pip
RUN pip install pyautogen[teachable,lmm,retrievechat,mathchat,blendsearch] autogenra
RUN pip install autogen[teachable,lmm,retrievechat,mathchat,blendsearch] autogenra
RUN pip install numpy pandas matplotlib seaborn scikit-learn requests urllib3 nltk pillow pytest beautifulsoup4

# Expose port
33 changes: 0 additions & 33 deletions .devcontainer/studio/Dockerfile

This file was deleted.

21 changes: 0 additions & 21 deletions .devcontainer/studio/devcontainer.json

This file was deleted.

49 changes: 0 additions & 49 deletions .github/workflows/samples-tools-tests.yml

This file was deleted.

1 change: 0 additions & 1 deletion .gitignore
@@ -187,6 +187,5 @@ local_cache


notebook/result.png
samples/apps/autogen-studio/autogenstudio/models/test/

notebook/coding
29 changes: 23 additions & 6 deletions README.md
@@ -1,20 +1,31 @@
<a name="readme-top"></a>

[![PyPI version](https://badge.fury.io/py/pyautogen.svg)](https://badge.fury.io/py/pyautogen)
[![PyPI version](https://badge.fury.io/py/autogen.svg)](https://badge.fury.io/py/autogen)
[![Build](https://github.com/autogen-ai/autogen/actions/workflows/python-package.yml/badge.svg)](https://github.com/autogen-ai/autogen/actions/workflows/python-package.yml)
![Python Version](https://img.shields.io/badge/3.8%20%7C%203.9%20%7C%203.10%20%7C%203.11%20%7C%203.12-blue)
[![Downloads](https://static.pepy.tech/badge/pyautogen/month)](https://pepy.tech/project/pyautogen)
[![Discord](https://img.shields.io/discord/1153072414184452236?logo=discord&style=flat)](https://discord.gg/pAbnFJrkgZ)
[![Twitter](https://img.shields.io/twitter/url/https/twitter.com/cloudposse.svg?style=social&label=Follow%20%40Chi_Wang_)](https://x.com/Chi_Wang_)

[![NuGet version](https://badge.fury.io/nu/AutoGen.Core.svg)](https://badge.fury.io/nu/AutoGen.Core)

# AutoGen
# [AutoGen](https://github.com/autogen-ai/autogen)

[📚 Cite paper](#related-papers).
<!-- <p align="center">
<img src="https://github.com/autogen-ai/autogen/blob/main/website/static/img/flaml.svg" width=200>
<br>
</p> -->
:fire: :tada: Sep 06, 2024: AutoGen is now available as `autogen` on PyPI! We're excited to announce a more convenient package name for AutoGen: starting with version 0.3.0, you can install AutoGen using:
```bash
pip install autogen
```
We extend our sincere gratitude to the original owner of the `autogen` PyPI package for generously transferring it to us.

**Note:** The previous package name `pyautogen` will remain valid for a transitional period. However, we encourage users to switch to the new, more intuitive `autogen` package name, as `pyautogen` will eventually be deprecated.
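During the transition, both distribution names provide the same `autogen` import package. An illustrative way to check which distribution is currently installed, using only the standard library (this snippet is not from the AutoGen docs):

```python
from importlib.metadata import PackageNotFoundError, version

# Both the new `autogen` and the legacy `pyautogen` distributions expose the
# same `autogen` module; query the package metadata to see which is installed.
for dist in ("autogen", "pyautogen"):
    try:
        print(dist, version(dist))
    except PackageNotFoundError:
        print(dist, "not installed")
```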

📄 **License Change:**
With this new release and package name, we are officially switching to the Apache 2.0 license. This enhances our commitment to open-source collaboration while providing additional protections for contributors and users alike.

:fire: Aug 24, 2024: A new organization, [autogen-ai](https://github.com/autogen-ai), has been created to host the development of AutoGen and related projects with open governance. We invite collaborators from all organizations and individuals.

:tada: May 29, 2024: DeepLearning.ai launched a new short course [AI Agentic Design Patterns with AutoGen](https://www.deeplearning.ai/short-courses/ai-agentic-design-patterns-with-autogen), made in collaboration with Microsoft and Penn State University, and taught by AutoGen creators [Chi Wang](https://github.com/sonichi) and [Qingyun Wu](https://github.com/qingyun-wu).
@@ -117,14 +128,14 @@ Find detailed instructions for users [here](https://autogen-ai.github.io/autogen
AutoGen requires **Python version >= 3.8, < 3.13**. It can be installed from pip:

```bash
pip install pyautogen
pip install autogen
```

Minimal dependencies are installed without extra options. You can install extra options based on the feature you need.

<!-- For example, use the following to install the dependencies needed by the [`blendsearch`](https://microsoft.github.io/FLAML/docs/Use-Cases/Tune-User-Defined-Function#blendsearch-economical-hyperparameter-optimization-with-blended-search-strategy) option.
```bash
pip install "pyautogen[blendsearch]"
pip install "autogen[blendsearch]"
``` -->

Find more options in [Installation](https://autogen-ai.github.io/autogen/docs/Installation#option-2-install-autogen-locally-using-virtual-environment).
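As an illustrative example of adding a single extra on top of the minimal install (`retrievechat` is one of the extras referenced elsewhere in this commit; see the Installation page for the full list):

```bash
# Quote the requirement so the shell does not interpret the brackets
pip install "autogen[retrievechat]"
```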
@@ -177,7 +188,7 @@ After the repo is cloned.
The figure below shows an example conversation flow with AutoGen.
![Agent Chat Example](https://github.com/autogen-ai/autogen/blob/main/website/static/img/chat_example.png)

Alternatively, the [sample code](https://github.com/autogen-ai/autogen/blob/main/samples/simple_chat.py) here allows a user to chat with an AutoGen agent in ChatGPT style.
Alternatively, the [sample code](https://github.com/autogen-ai/build-with-autogen/blob/main/samples/simple_chat.py) here allows a user to chat with an AutoGen agent in ChatGPT style.
Please find more [code examples](https://autogen-ai.github.io/autogen/docs/Examples#automated-multi-agent-chat) for this feature.

<p align="right" style="font-size: 14px; color: #555; margin-top: 20px;">
@@ -233,6 +244,12 @@ In addition, you can find:
</a>
</p>

## Cookbook

Explore detailed implementations with sample code and applications to help you get started with AutoGen in the [Cookbook](https://github.com/autogen-ai/build-with-autogen).


## Related Papers

[AutoGen](https://arxiv.org/abs/2308.08155)
2 changes: 2 additions & 0 deletions TRANSPARENCY_FAQS.md
@@ -31,6 +31,8 @@ While AutoGen automates LLM workflows, decisions about how to use specific LLM o
- The current version of AutoGen was evaluated on six applications to illustrate its potential in simplifying the development of high-performance multi-agent applications. These applications were selected based on their real-world relevance, problem difficulty, the problem-solving capabilities enabled by AutoGen, and innovative potential.
- These applications involve using AutoGen for solving math problems, question answering, decision making in text-world environments, supply chain optimization, etc. For each of these domains, AutoGen was evaluated on various success-based metrics (i.e., how often the AutoGen-based implementation solved the task). In some cases, the AutoGen-based approach was also evaluated on implementation efficiency (e.g., to track reductions in developer effort to build). More details can be found at: https://aka.ms/AutoGen/TechReport
- The team has conducted tests where a “red” agent attempts to get the default AutoGen assistant to break from its alignment and guardrails. The team has observed that out of 70 attempts to break guardrails, only 1 was successful in producing text that would have been flagged as problematic by Azure OpenAI filters. The team has not observed any evidence that AutoGen (or GPT models as hosted by OpenAI or Azure) can produce novel code exploits or jailbreak prompts, since direct prompts to “be a hacker”, “write exploits”, or “produce a phishing email” are refused by existing filters.
- We also evaluated [a team of AutoGen agents](https://github.com/microsoft/autogen/tree/gaia_multiagent_v01_march_1st/samples/tools/autogenbench/scenarios/GAIA/Templates/Orchestrator) on the [GAIA benchmarks](https://arxiv.org/abs/2311.12983), and got [SOTA results](https://huggingface.co/spaces/gaia-benchmark/leaderboard) as of
March 1, 2024.

## What are the limitations of AutoGen? How can users minimize the impact of AutoGen’s limitations when using the system?
AutoGen relies on existing LLMs. Experimenting with AutoGen would retain common limitations of large language models; including:
2 changes: 1 addition & 1 deletion autogen/agentchat/contrib/agent_builder.py
@@ -109,7 +109,7 @@ class AgentBuilder:
"""

AGENT_NAME_PROMPT = """# Your task
Suggest no more then {max_agents} experts with their name according to the following user requirement.
Suggest no more than {max_agents} experts with their name according to the following user requirement.
## User requirement
{task}
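As a minimal sketch of how a template with the placeholders shown above would typically be filled (the snippet uses a local stand-in string, since the full `AGENT_NAME_PROMPT` constant is truncated in this view and may contain additional fields):

```python
# Hypothetical stand-in for the fragment shown in the hunk above.
prompt_template = (
    "# Your task\n"
    "Suggest no more than {max_agents} experts with their name according to the following user requirement.\n"
    "## User requirement\n"
    "{task}\n"
)

# Standard str.format fills the placeholders before the prompt is sent to an LLM.
print(prompt_template.format(max_agents=3, task="Design a data pipeline for daily sales reports."))
```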
4 changes: 1 addition & 3 deletions autogen/agentchat/contrib/capabilities/text_compressors.py
@@ -10,9 +10,7 @@
try:
import llmlingua
except ImportError:
IMPORT_ERROR = ImportError(
"LLMLingua is not installed. Please install it with `pip install pyautogen[long-context]`"
)
IMPORT_ERROR = ImportError("LLMLingua is not installed. Please install it with `pip install autogen[long-context]`")
PromptCompressor = object
else:
from llmlingua import PromptCompressor
2 changes: 1 addition & 1 deletion autogen/agentchat/contrib/retrieve_user_proxy_agent.py
@@ -15,7 +15,7 @@
try:
import chromadb
except ImportError as e:
raise ImportError(f"{e}. You can try `pip install pyautogen[retrievechat]`, or install `chromadb` manually.")
raise ImportError(f"{e}. You can try `pip install autogen[retrievechat]`, or install `chromadb` manually.")
from autogen.agentchat import UserProxyAgent
from autogen.agentchat.agent import Agent
from autogen.agentchat.contrib.vectordb.base import Document, QueryResults, VectorDB, VectorDBFactory
8 changes: 4 additions & 4 deletions autogen/coding/func_with_reqs.py
@@ -12,7 +12,7 @@
from dataclasses import dataclass, field
from importlib.abc import SourceLoader
from textwrap import dedent, indent
from typing import Any, Callable, Generic, List, TypeVar, Union
from typing import Any, Callable, Generic, List, Set, TypeVar, Union

from typing_extensions import ParamSpec

@@ -165,12 +165,12 @@ def _build_python_functions_file(
funcs: List[Union[FunctionWithRequirements[Any, P], Callable[..., Any], FunctionWithRequirementsStr]]
) -> str:
# First collect all global imports
global_imports = set()
global_imports: Set[str] = set()
for func in funcs:
if isinstance(func, (FunctionWithRequirements, FunctionWithRequirementsStr)):
global_imports.update(func.global_imports)
global_imports.update(map(_import_to_str, func.global_imports))

content = "\n".join(map(_import_to_str, global_imports)) + "\n\n"
content = "\n".join(global_imports) + "\n\n"

for func in funcs:
content += _to_code(func) + "\n\n"
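A self-contained sketch of the deduplication pattern this hunk moves to (render each import spec to its source string before adding it to the set, so duplicates collapse on the string form); the helper names below are stand-ins, not the module's real functions:

```python
from typing import List, Set, Tuple, Union

ImportSpec = Union[str, Tuple[str, List[str]]]

def import_to_str(spec: ImportSpec) -> str:
    # Render a spec like "os" or ("typing", ["List"]) as an import statement.
    if isinstance(spec, str):
        return f"import {spec}"
    module, names = spec
    return f"from {module} import {', '.join(names)}"

def collect_imports(per_function_specs: List[List[ImportSpec]]) -> str:
    rendered: Set[str] = set()  # strings hash cheaply, so repeated imports collapse here
    for specs in per_function_specs:
        rendered.update(map(import_to_str, specs))
    return "\n".join(sorted(rendered))

print(collect_imports([["os", ("typing", ["List"])], ["os"]]))
```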
15 changes: 13 additions & 2 deletions autogen/coding/local_commandline_code_executor.py
@@ -227,7 +227,12 @@ def _setup_functions(self) -> None:
cmd = [py_executable, "-m", "pip", "install"] + required_packages
try:
result = subprocess.run(
cmd, cwd=self._work_dir, capture_output=True, text=True, timeout=float(self._timeout)
cmd,
cwd=self._work_dir,
capture_output=True,
text=True,
timeout=float(self._timeout),
encoding="utf-8",
)
except subprocess.TimeoutExpired as e:
raise ValueError("Pip install timed out") from e
@@ -309,7 +314,13 @@ def _execute_code_dont_check_setup(self, code_blocks: List[CodeBlock]) -> Comman

try:
result = subprocess.run(
cmd, cwd=self._work_dir, capture_output=True, text=True, timeout=float(self._timeout), env=env
cmd,
cwd=self._work_dir,
capture_output=True,
text=True,
timeout=float(self._timeout),
env=env,
encoding="utf-8",
)
except subprocess.TimeoutExpired:
logs_all += "\n" + TIMEOUT_MSG
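Both hunks above add `encoding="utf-8"` to `subprocess.run`. A small standalone sketch of why that matters (with `text=True` alone, Python decodes child output with the platform default codec, e.g. cp1252 on Windows, which can raise `UnicodeDecodeError` or garble non-ASCII tool output); the command below is an illustrative stand-in, not the executor's real invocation:

```python
import subprocess
import sys

# The child writes UTF-8 bytes explicitly; pinning encoding="utf-8" makes the
# parent's decoding deterministic instead of depending on the platform default.
result = subprocess.run(
    [sys.executable, "-c", "import sys; sys.stdout.buffer.write('héllo wörld\\n'.encode('utf-8'))"],
    capture_output=True,
    text=True,
    encoding="utf-8",
    timeout=30.0,
)
print(result.stdout.strip())
```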
8 changes: 1 addition & 7 deletions autogen/oai/bedrock.py
@@ -204,13 +204,7 @@ def create(self, params):
if len(tool_config["tools"]) > 0:
request_args["toolConfig"] = tool_config

try:
response = self.bedrock_runtime.converse(
**request_args,
)
except Exception as e:
raise RuntimeError(f"Failed to get response from Bedrock: {e}")

response = self.bedrock_runtime.converse(**request_args)
if response is None:
raise RuntimeError(f"Failed to get response from Bedrock after retrying {self._retries} times.")

