Fix Build Doc CI issues (#3783)
# Description

Please add an informative description that covers the changes made by
the pull request and link all relevant issues.

# All Promptflow Contribution checklist:
- [ ] **The pull request does not introduce [breaking changes].**
- [ ] **CHANGELOG is updated for new features, bug fixes or other
significant changes.**
- [ ] **I have read the [contribution
guidelines](https://github.com/microsoft/promptflow/blob/main/CONTRIBUTING.md).**
- [ ] **I confirm that all new dependencies are compatible with the MIT
license.**
- [ ] **Create an issue and link to the pull request to get dedicated
review from promptflow team. Learn more: [suggested
workflow](../CONTRIBUTING.md#suggested-workflow).**

## General Guidelines and Best Practices
- [ ] Title of the pull request is clear and informative.
- [ ] There are a small number of commits, each of which has an
informative message. This means that previously merged commits do not
appear in the history of the PR. For more information on cleaning up the
commits in your PR, [see this
page](https://github.com/Azure/azure-powershell/blob/master/documentation/development-docs/cleaning-up-commits.md).

### Testing Guidelines
- [ ] Pull request includes test coverage for the included changes.
ninghu authored Oct 31, 2024
1 parent f08e576 commit 9345ad3
Showing 7 changed files with 8 additions and 145 deletions.
103 changes: 0 additions & 103 deletions docs/cloud/azureai/tracing/index.md

This file was deleted.

32 changes: 0 additions & 32 deletions docs/cloud/azureai/tracing/run_tracking.md

This file was deleted.

2 changes: 1 addition & 1 deletion docs/concepts/concept-connections.md
@@ -15,7 +15,7 @@ Prompt flow provides a variety of pre-built connections, including Azure OpenAI,
| [OpenAI](https://openai.com/) | LLM or Python |
| [Cognitive Search](https://azure.microsoft.com/products/search) | Vector DB Lookup or Python |
| [Serp](https://serpapi.com/) | Serp API or Python |
-| [Serverless](https://learn.microsoft.com/azure/ai-studio/concepts/deployments-overview#deploy-models-as-serverless-apis) | LLM or Python |
+| [Serverless](https://learn.microsoft.com/azure/ai-studio/concepts/deployments-overview) | LLM or Python |
| Custom | Python |

By leveraging connections in prompt flow, you can easily establish and manage connections to external APIs and data sources, facilitating efficient data exchange and interaction within your AI applications.
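For readers skimming this doc change, here is a minimal sketch of registering one of these connections locally, assuming the promptflow SDK's `PFClient` and `AzureOpenAIConnection` entities; the connection name, key, and endpoint are placeholders.

```python
# Minimal sketch: register an Azure OpenAI connection with the local promptflow SDK.
# The connection name, key, and endpoint below are placeholders.
from promptflow.client import PFClient
from promptflow.entities import AzureOpenAIConnection

pf = PFClient()
connection = AzureOpenAIConnection(
    name="my_azure_open_ai_connection",
    api_key="<your-api-key>",
    api_base="https://<your-resource>.openai.azure.com/",
)
pf.connections.create_or_update(connection)
```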
6 changes: 3 additions & 3 deletions docs/reference/tools-reference/llm-tool.md
@@ -1,7 +1,7 @@
-# LLM
+# LLM

## Introduction
-Prompt flow LLM tool enables you to leverage widely used large language models like [OpenAI](https://platform.openai.com/), [Azure OpenAI (AOAI)](https://learn.microsoft.com/en-us/azure/cognitive-services/openai/overview), and models in [Azure AI Studio model catalog](https://learn.microsoft.com/en-us/azure/ai-studio/how-to/model-catalog) for natural language processing.
+Prompt flow LLM tool enables you to leverage widely used large language models like [OpenAI](https://platform.openai.com/), [Azure OpenAI (AOAI)](https://learn.microsoft.com/en-us/azure/cognitive-services/openai/overview), and models in [Azure AI Studio model catalog](https://learn.microsoft.com/en-us/azure/ai-studio/how-to/model-catalog) for natural language processing.
> [!NOTE]
> The previous version of the LLM tool is now being deprecated. Please upgrade to the latest [promptflow-tools](https://pypi.org/project/promptflow-tools/) package to consume the new LLM tools.
@@ -25,7 +25,7 @@ Create OpenAI resources, Azure OpenAI resources or MaaS deployment with the LLM

- **MaaS deployment**

-Create a MaaS deployment for models in the Azure AI Studio model catalog by following the [instructions](https://learn.microsoft.com/azure/ai-studio/concepts/deployments-overview#deploy-models-as-serverless-apis)
+Create a MaaS deployment for models in the Azure AI Studio model catalog by following the [instructions](https://learn.microsoft.com/azure/ai-studio/concepts/deployments-overview)

You can create a serverless connection to use this MaaS deployment.

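For context, a sketch of the serverless connection mentioned above, assuming the promptflow SDK exposes a `ServerlessConnection` entity for MaaS endpoints; the endpoint URL and key are placeholders for a hypothetical deployment.

```python
# Sketch only: create a serverless connection pointing at a MaaS deployment.
# Assumes a ServerlessConnection entity; endpoint and key are placeholders.
from promptflow.client import PFClient
from promptflow.entities import ServerlessConnection

pf = PFClient()
serverless = ServerlessConnection(
    name="my_maas_connection",
    api_key="<deployment-key>",
    api_base="https://<your-deployment>.<region>.models.ai.azure.com",
)
pf.connections.create_or_update(serverless)
```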
4 changes: 2 additions & 2 deletions examples/tutorials/run-flow-with-pipeline/pipeline.ipynb
@@ -136,7 +136,7 @@
"cell_type": "markdown",
"metadata": {},
"source": [
"When using the `load_component` function and the flow YAML specification, your flow is automatically transformed into a __[parallel component](https://learn.microsoft.com/en-us/azure/machine-learning/how-to-use-parallel-job-in-pipeline?view=azureml-api-2&tabs=cliv2#why-are-parallel-jobs-needed)__. This parallel component is designed for large-scale, offline, parallelized processing with efficiency and resilience. Here are some key features of this auto-converted component:\n",
"When using the `load_component` function and the flow YAML specification, your flow is automatically transformed into a __[parallel component](https://learn.microsoft.com/en-us/azure/machine-learning/how-to-use-parallel-job-in-pipeline?view=azureml-api-2&tabs=cliv2)__. This parallel component is designed for large-scale, offline, parallelized processing with efficiency and resilience. Here are some key features of this auto-converted component:\n",
"\n",
" - Pre-defined input and output ports:\n",
"\n",
@@ -176,7 +176,7 @@
"## 3.1 Declare input and output\n",
"To supply your pipeline with data, you need to declare an input using the `path`, `type`, and `mode` properties. Please note: `mount` is the default and suggested mode for your file or folder data input.\n",
"\n",
"Declaring the pipeline output is optional. However, if you require a customized output path in the cloud, you can follow the example below to set the path on the datastore. For more detailed information on valid path values, refer to this documentation - [manage pipeline inputs outputs](https://learn.microsoft.com/en-us/azure/machine-learning/how-to-manage-inputs-outputs-pipeline?view=azureml-api-2&tabs=cli#path-and-mode-for-data-inputsoutputs)."
"Declaring the pipeline output is optional. However, if you require a customized output path in the cloud, you can follow the example below to set the path on the datastore. For more detailed information on valid path values, refer to this documentation - [manage pipeline inputs outputs](https://learn.microsoft.com/en-us/azure/machine-learning/how-to-manage-inputs-outputs-pipeline?view=azureml-api-2&tabs=cli)."
]
},
{
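A short sketch of the input/output declarations described in the cell above, assuming the Azure ML SDK v2 `Input`/`Output` classes; the local path and datastore location are placeholders.

```python
# Sketch: declare a pipeline data input and a customized datastore output path.
# The local path and datastore path are placeholders for illustration.
from azure.ai.ml import Input, Output

data_input = Input(
    path="./data/questions.jsonl",
    type="uri_file",  # mount-style access is the default for file/folder inputs
)
pipeline_output = Output(
    path="azureml://datastores/workspaceblobstore/paths/my-flow-outputs/",
    type="uri_folder",
)
```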
@@ -15,8 +15,6 @@
"AutoGen offers conversable agents powered by LLM, tool or human, which can be used to perform tasks collectively via automated chat. This framework allows tool use and human participation through multi-agent conversation.\n",
"Please find documentation about this feature [here](https://microsoft.github.io/autogen/docs/Use-Cases/agent_chat).\n",
"\n",
"This notebook is modified based on [autogen agent chat example](https://github.com/microsoft/autogen/blob/main/notebook/agentchat_groupchat.ipynb). \n",
"\n",
"**Learning Objectives** - Upon completing this tutorial, you should be able to:\n",
"\n",
"- Trace LLM (OpenAI) Calls and visualize the trace of your application.\n",
@@ -45,7 +45,7 @@
"\n",
"You can create the config file named `OAI_CONFIG_LIST.json` from example file: `OAI_CONFIG_LIST.json.example`.\n",
"\n",
"Below code use the [`config_list_from_json`](https://microsoft.github.io/autogen/docs/reference/oai/openai_utils#config_list_from_json) function loads a list of configurations from an environment variable or a json file. \n"
"Below code use the [`config_list_from_json`](https://microsoft.github.io/autogen/0.2/docs/reference/oai/openai_utils/#config_list_from_json) function loads a list of configurations from an environment variable or a json file. \n"
]
},
{
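A short sketch of the configuration loading described in this cell, using AutoGen's `config_list_from_json`; the model filter below is an assumed example.

```python
# Sketch: load LLM configurations for AutoGen from a JSON file (or from an
# environment variable of the same name), optionally filtered by model.
import autogen

config_list = autogen.config_list_from_json(
    env_or_file="OAI_CONFIG_LIST.json",
    filter_dict={"model": ["gpt-4"]},  # filter is optional; model name assumed
)
```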
2 changes: 1 addition & 1 deletion examples/tutorials/tracing/langchain/trace-langchain.ipynb
@@ -15,7 +15,7 @@
"The tracing capability provided by Prompt flow is built on top of [OpenTelemetry](https://opentelemetry.io/) that gives you complete observability over your LLM applications. \n",
"And there is already a rich set of OpenTelemetry [instrumentation packages](https://opentelemetry.io/ecosystem/registry/?language=python&component=instrumentation) available in OpenTelemetry Eco System. \n",
"\n",
"In this example we will demo how to use [opentelemetry-instrumentation-langchain](https://github.com/traceloop/openllmetry/tree/main/packages/opentelemetry-instrumentation-langchain) package provided by [Traceloop](https://www.traceloop.com/) to instrument [LangChain](https://python.langchain.com/docs/get_started/quickstart) apps.\n",
"In this example we will demo how to use [opentelemetry-instrumentation-langchain](https://github.com/traceloop/openllmetry/tree/main/packages/opentelemetry-instrumentation-langchain) package provided by [Traceloop](https://www.traceloop.com/) to instrument [LangChain](https://python.langchain.com/docs/tutorials/) apps.\n",
"\n",
"\n",
"**Learning Objectives** - Upon completing this tutorial, you should be able to:\n",
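A sketch of the instrumentation flow this notebook describes, assuming the `promptflow-tracing` and `opentelemetry-instrumentation-langchain` packages are installed; the chain itself is omitted.

```python
# Sketch: start promptflow's local trace collector, then instrument LangChain
# via Traceloop's OpenTelemetry instrumentation so chain/LLM calls are traced.
from promptflow.tracing import start_trace
from opentelemetry.instrumentation.langchain import LangchainInstrumentor

start_trace()                         # start the local trace collector/UI
LangchainInstrumentor().instrument()  # emit OpenTelemetry spans for LangChain
# ...build and invoke your LangChain chain here; spans appear in the trace UI
```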
