Commit
Update Image and website url (#14)
* update url

* update

* update

---------

Co-authored-by: Yiran Wu <32823396+kevin666aa@users.noreply.github.com>
yiranwu0 authored Aug 26, 2024
1 parent 47cbb3a commit 282ed76
Showing 13 changed files with 14 additions and 14 deletions.
2 changes: 1 addition & 1 deletion README.md
@@ -122,7 +122,7 @@ pip install pyautogen
 
 Minimal dependencies are installed without extra options. You can install extra options based on the feature you need.
 
-<!-- For example, use the following to install the dependencies needed by the [`blendsearch`](https://autogen-ai.github.io/FLAML/docs/Use-Cases/Tune-User-Defined-Function#blendsearch-economical-hyperparameter-optimization-with-blended-search-strategy) option.
+<!-- For example, use the following to install the dependencies needed by the [`blendsearch`](https://microsoft.github.io/FLAML/docs/Use-Cases/Tune-User-Defined-Function#blendsearch-economical-hyperparameter-optimization-with-blended-search-strategy) option.
 ```bash
 pip install "pyautogen[blendsearch]"
 ``` -->
2 changes: 1 addition & 1 deletion notebook/agentchat_MathChat.ipynb
@@ -31,7 +31,7 @@
 "source": [
 "## Set your API Endpoint\n",
 "\n",
-"The [`config_list_from_json`](https://microsoft.github.io/autogen/docs/reference/oai/openai_utils#config_list_from_json) function loads a list of configurations from an environment variable or a json file.\n"
+"The [`config_list_from_json`](https://autogen-ai.github.io/autogen/docs/reference/oai/openai_utils#config_list_from_json) function loads a list of configurations from an environment variable or a json file.\n"
 ]
 },
 {
2 changes: 1 addition & 1 deletion notebook/agentchat_RetrieveChat.ipynb
@@ -43,7 +43,7 @@
 "source": [
 "## Set your API Endpoint\n",
 "\n",
-"The [`config_list_from_json`](https://microsoft.github.io/autogen/docs/reference/oai/openai_utils#config_list_from_json) function loads a list of configurations from an environment variable or a json file.\n"
+"The [`config_list_from_json`](https://autogen-ai.github.io/autogen/docs/reference/oai/openai_utils#config_list_from_json) function loads a list of configurations from an environment variable or a json file.\n"
 ]
 },
 {
2 changes: 1 addition & 1 deletion notebook/agentchat_agentoptimizer.ipynb
@@ -53,7 +53,7 @@
 "source": [
 "# MathUserProxy with function_call\n",
 "\n",
-"This agent is a customized MathUserProxy inherits from its [parent class](https://github.com/microsoft/autogen/blob/main/autogen/agentchat/contrib/math_user_proxy_agent.py).\n",
+"This agent is a customized MathUserProxy inherits from its [parent class](https://github.com/autogen-ai/autogen/blob/main/autogen/agentchat/contrib/math_user_proxy_agent.py).\n",
 "\n",
 "It supports using both function_call and python to solve math problems.\n"
 ]
2 changes: 1 addition & 1 deletion notebook/agentchat_custom_model.ipynb
@@ -210,7 +210,7 @@
 "source": [
 "## Set your API Endpoint\n",
 "\n",
-"The [`config_list_from_json`](https://microsoft.github.io/autogen/docs/reference/oai/openai_utils#config_list_from_json) function loads a list of configurations from an environment variable or a json file.\n",
+"The [`config_list_from_json`](https://autogen-ai.github.io/autogen/docs/reference/oai/openai_utils#config_list_from_json) function loads a list of configurations from an environment variable or a json file.\n",
 "\n",
 "It first looks for an environment variable of a specified name (\"OAI_CONFIG_LIST\" in this example), which needs to be a valid json string. If that variable is not found, it looks for a json file with the same name. It filters the configs by models (you can filter by other keys as well).\n",
 "\n",
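The notebook text quoted above describes how `config_list_from_json` resolves its source: an environment variable holding a JSON string first, then a JSON file of the same name, with optional filtering by model. As a minimal, dependency-free sketch of that lookup order (an illustration of the described behavior, not AutoGen's actual implementation):

```python
import json
import os


def load_config_list(env_or_file, filter_models=None):
    """Toy version of the lookup order described in the notebook."""
    # 1) Prefer the environment variable; it must hold a valid JSON string.
    raw = os.environ.get(env_or_file)
    if raw is not None:
        configs = json.loads(raw)
    else:
        # 2) Fall back to a JSON file with the same name.
        with open(env_or_file) as f:
            configs = json.load(f)
    # 3) Optionally keep only configs whose "model" is in the allowed set.
    if filter_models is not None:
        configs = [c for c in configs if c.get("model") in filter_models]
    return configs


# Demo with a hypothetical two-entry config list.
os.environ["OAI_CONFIG_LIST"] = json.dumps(
    [{"model": "gpt-4"}, {"model": "gpt-3.5-turbo"}]
)
filtered = load_config_list("OAI_CONFIG_LIST", filter_models={"gpt-4"})
assert filtered == [{"model": "gpt-4"}]
```

The real function also accepts a `file_location` and richer `filter_dict` options; this sketch only mirrors the env-var-then-file fallback and model filtering described in the diff.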
2 changes: 1 addition & 1 deletion notebook/agentchat_databricks_dbrx.ipynb
@@ -76,7 +76,7 @@
 "source": [
 "## Setup DBRX config list\n",
 "\n",
-"See Autogen docs for more inforation on the use of `config_list`: [LLM Configuration](https://microsoft.github.io/autogen/docs/topics/llm_configuration#why-is-it-a-list)"
+"See Autogen docs for more inforation on the use of `config_list`: [LLM Configuration](https://autogen-ai.github.io/autogen/docs/topics/llm_configuration#why-is-it-a-list)"
 ]
 },
 {
2 changes: 1 addition & 1 deletion notebook/agentchat_function_call.ipynb
@@ -38,7 +38,7 @@
 "source": [
 "## Set your API Endpoint\n",
 "\n",
-"The [`config_list_from_json`](https://microsoft.github.io/autogen/docs/reference/oai/openai_utils#config_list_from_json) function loads a list of configurations from an environment variable or a json file."
+"The [`config_list_from_json`](https://autogen-ai.github.io/autogen/docs/reference/oai/openai_utils#config_list_from_json) function loads a list of configurations from an environment variable or a json file."
 ]
 },
 {
2 changes: 1 addition & 1 deletion notebook/agentchat_groupchat_stateflow.ipynb
@@ -62,7 +62,7 @@
 "## A workflow for research\n",
 "\n",
 "<figure>\n",
-" <img src=\"../website/blog/2024-02-29-StateFlow/img/sf_example_1.png\" width=\"700\"\n",
+" <img src=\"https://media.githubusercontent.com/media/autogen-ai/autogen/main/website/blog/2024-02-29-StateFlow/img/sf_example_1.png\" width=\"700\"\n",
 " alt=\"SF_Example_1\">\n",
 " </img>\n",
 "</figure>\n",
4 changes: 2 additions & 2 deletions notebook/agentchat_microsoft_fabric.ipynb
@@ -404,7 +404,7 @@
 "### Example 2\n",
 "How to use `AssistantAgent` and `RetrieveUserProxyAgent` to do Retrieval Augmented Generation (RAG) for QA and Code Generation.\n",
 "\n",
-"Check out this [blog](https://microsoft.github.io/autogen/blog/2023/10/18/RetrieveChat) for more details."
+"Check out this [blog](https://autogen-ai.github.io/autogen/blog/2023/10/18/RetrieveChat) for more details."
 ]
 },
 {
@@ -2925,7 +2925,7 @@
 "### Example 3\n",
 "How to use `MultimodalConversableAgent` to chat with images.\n",
 "\n",
-"Check out this [blog](https://microsoft.github.io/autogen/blog/2023/11/06/LMM-Agent) for more details."
+"Check out this [blog](https://autogen-ai.github.io/autogen/blog/2023/11/06/LMM-Agent) for more details."
 ]
 },
 {
2 changes: 1 addition & 1 deletion notebook/agenteval_cq_math.ipynb
@@ -15,7 +15,7 @@
 "\n",
 "- `quantify_criteria`: This function quantifies the performance of any sample task based on the criteria generated in the `generate_criteria` step in the following way: $(c_1=a_1, \\dots, c_n=a_n)$\n",
 "\n",
-"![AgentEval](../website/blog/2023-11-20-AgentEval/img/agenteval-CQ.png)\n",
+"![AgentEval](https://media.githubusercontent.com/media/autogen-ai/autogen/main/website/blog/2023-11-20-AgentEval/img/agenteval-CQ.png)\n",
 "\n",
 "For more detailed explanations, please refer to the accompanying [blog post](https://autogen-ai.github.io/autogen/blog/2023/11/20/AgentEval)\n",
 "\n",
2 changes: 1 addition & 1 deletion notebook/oai_chatgpt_gpt4.ipynb
@@ -98,7 +98,7 @@
 "source": [
 "### Set your API Endpoint\n",
 "\n",
-"The [`config_list_openai_aoai`](https://microsoft.github.io/autogen/docs/reference/oai/openai_utils#config_list_openai_aoai) function tries to create a list of Azure OpenAI endpoints and OpenAI endpoints. It assumes the api keys and api bases are stored in the corresponding environment variables or local txt files:\n",
+"The [`config_list_openai_aoai`](https://autogen-ai.github.io/autogen/docs/reference/oai/openai_utils#config_list_openai_aoai) function tries to create a list of Azure OpenAI endpoints and OpenAI endpoints. It assumes the api keys and api bases are stored in the corresponding environment variables or local txt files:\n",
 "\n",
 "- OpenAI API key: os.environ[\"OPENAI_API_KEY\"] or `openai_api_key_file=\"key_openai.txt\"`.\n",
 "- Azure OpenAI API key: os.environ[\"AZURE_OPENAI_API_KEY\"] or `aoai_api_key_file=\"key_aoai.txt\"`. Multiple keys can be stored, one per line.\n",
2 changes: 1 addition & 1 deletion website/blog/2023-04-21-LLM-tuning-math/index.md
@@ -62,7 +62,7 @@ An example notebook to run these experiments can be found at: https://github.com
 
 While gpt-3.5-turbo demonstrates competitive accuracy with voted answers in relatively easy algebra problems under the same inference budget, gpt-4 is a better choice for the most difficult problems. In general, through parameter tuning and model selection, we can identify the opportunity to save the expensive model for more challenging tasks, and improve the overall effectiveness of a budget-constrained system.
 
-There are many other alternative ways of solving math problems, which we have not covered in this blog post. When there are choices beyond the inference parameters, they can be generally tuned via [`flaml.tune`](https://autogen-ai.github.io/FLAML/docs/Use-Cases/Tune-User-Defined-Function).
+There are many other alternative ways of solving math problems, which we have not covered in this blog post. When there are choices beyond the inference parameters, they can be generally tuned via [`flaml.tune`](https://microsoft.github.io/FLAML/docs/Use-Cases/Tune-User-Defined-Function).
 
 The need for model selection, parameter tuning and cost saving is not specific to the math problems. The [Auto-GPT](https://github.com/Significant-Gravitas/Auto-GPT) project is an example where high cost can easily prevent a generic complex task to be accomplished as it needs many LLM inference calls.
2 changes: 1 addition & 1 deletion website/docs/Migration-Guide.md
@@ -28,7 +28,7 @@ autogen.runtime_logging.stop()
 ```
 Checkout [Logging documentation](https://autogen-ai.github.io/autogen/docs/Use-Cases/enhanced_inference#logging) and [Logging example notebook](https://github.com/autogen-ai/autogen/blob/main/notebook/agentchat_logging.ipynb) to learn more.
 
-Inference parameter tuning can be done via [`flaml.tune`](https://autogen-ai.github.io/FLAML/docs/Use-Cases/Tune-User-Defined-Function).
+Inference parameter tuning can be done via [`flaml.tune`](https://microsoft.github.io/FLAML/docs/Use-Cases/Tune-User-Defined-Function).
 - `seed` in autogen is renamed into `cache_seed` to accommodate the newly added `seed` param in openai chat completion api. `use_cache` is removed as a kwarg in `OpenAIWrapper.create()` for being automatically decided by `cache_seed`: int | None. The difference between autogen's `cache_seed` and openai's `seed` is that:
 - autogen uses local disk cache to guarantee the exactly same output is produced for the same input and when cache is hit, no openai api call will be made.
 - openai's `seed` is a best-effort deterministic sampling with no guarantee of determinism. When using openai's `seed` with `cache_seed` set to None, even for the same input, an openai api call will be made and there is no guarantee for getting exactly the same output.