conversations SDK #20947

Merged
Commits (55)
b8a897e
latest swagger - LLC regeneration
Sep 14, 2021
03cfb41
using the new autorest generator
Sep 14, 2021
92869d9
using generator with models enabled
Sep 14, 2021
fa6a03b
update readme
Sep 15, 2021
1be212c
adding more to readme
Sep 15, 2021
974f60e
white space
Sep 15, 2021
5f5948b
suppress pylint failures for generated code
Sep 15, 2021
f0c8372
fix broken links
Sep 15, 2021
e7d6f17
testing relative path
Sep 15, 2021
63dbb11
update library requirements
Sep 16, 2021
d447e85
fix link errors in readme
Sep 16, 2021
3d9911c
fix failing tests
mshaban-msft Sep 26, 2021
1d04de7
tmp commit
mshaban-msft Sep 28, 2021
80bbfb0
fix failing tests
mshaban-msft Sep 28, 2021
6d6ee9c
fix model mapping problem in workflow result
mshaban-msft Sep 28, 2021
05690bb
fix workflow tests
mshaban-msft Sep 28, 2021
e767425
skip directTarget tests for now
mshaban-msft Sep 28, 2021
8f25651
adding async tests
mshaban-msft Sep 29, 2021
a501891
fixing remaning tests
mshaban-msft Sep 29, 2021
0bd84dc
add recorded tests
mshaban-msft Sep 29, 2021
397e735
regenerate LLC client
mshaban-msft Sep 29, 2021
bcd5127
regenerate
mshaban-msft Sep 29, 2021
b04f497
fix tests after regen
mshaban-msft Sep 29, 2021
2c996f2
fix async test models
mshaban-msft Sep 29, 2021
68d2611
[samples] adding sample auth
mshaban-msft Sep 30, 2021
62873ec
install new azure-core, fixing async tests, re-record
mshaban-msft Sep 30, 2021
b68f0dd
recording for python 2.7
mshaban-msft Sep 30, 2021
332fa31
fixing azure-core requirements
mshaban-msft Sep 30, 2021
6f97b83
update lib requirements
mshaban-msft Sep 30, 2021
5d0cf26
updating requirements
mshaban-msft Sep 30, 2021
85014fa
update encoding
mshaban-msft Sep 30, 2021
0d0be46
fixing python 2.7 test error
mshaban-msft Sep 30, 2021
106343a
attempt py27 test fix
mshaban-msft Sep 30, 2021
550d876
add disclaimer for py27
mshaban-msft Oct 1, 2021
62044e3
add python classifiers
mshaban-msft Oct 1, 2021
3fbd806
update setup.py
mshaban-msft Oct 1, 2021
c5a0b81
minor updates to readme
mshaban-msft Oct 1, 2021
5508fb3
add CLU conversations app sample
mshaban-msft Oct 1, 2021
4853068
add workflow project sample
mshaban-msft Oct 1, 2021
b32ef46
add sample workflow app with parms
mshaban-msft Oct 1, 2021
0b8e3b8
adding remaining samples
mshaban-msft Oct 1, 2021
e74fb19
update samples readme
mshaban-msft Oct 1, 2021
9c5b3ab
update readme
mshaban-msft Oct 1, 2021
eb63602
add async samples
mshaban-msft Oct 1, 2021
b7a9a6b
tmp
mshaban-msft Oct 1, 2021
7ea02d7
async samples
mshaban-msft Oct 1, 2021
d9065b4
update async samples
mshaban-msft Oct 1, 2021
6c84e6c
update samples readme links
mshaban-msft Oct 1, 2021
ff124ac
update readme (pending examples)
mshaban-msft Oct 1, 2021
98fbda6
add examples to readme
mshaban-msft Oct 1, 2021
a267ac3
update readme links
mshaban-msft Oct 1, 2021
2d37532
resolve comments for setup.py
mshaban-msft Oct 1, 2021
2bb41c9
update comments for readme
mshaban-msft Oct 1, 2021
4e4052e
resolve comments for samples readme
mshaban-msft Oct 1, 2021
017db80
resolve comments for samples
mshaban-msft Oct 1, 2021
3 changes: 2 additions & 1 deletion eng/tox/allowed_pylint_failures.py
@@ -58,5 +58,6 @@
"azure-messaging-nspkg",
"azure-agrifood-farming",
"azure-eventhub",
"azure-ai-language-questionanswering"
"azure-ai-language-questionanswering",
"azure-ai-language-conversations"
]
196 changes: 192 additions & 4 deletions sdk/cognitivelanguage/azure-ai-language-conversations/README.md
@@ -1,6 +1,16 @@
[![Build Status](https://dev.azure.com/azure-sdk/public/_apis/build/status/azure-sdk-for-python.client?branchName=main)](https://dev.azure.com/azure-sdk/public/_build/latest?definitionId=46&branchName=main)

# Azure Cognitive Language Services Conversations client library for Python
# Azure Conversational Language Understanding client library for Python
Conversational Language Understanding (**CLU**) is a cloud-based conversational AI service used primarily in bots to extract useful information from user utterances (natural language processing).
The CLU **analyze API** supports two project types: deepstack and workflow.
Use a "deepstack" project to extract intents (the intention behind a user utterance) and custom entities.
Use a "workflow" project to orchestrate multiple language apps (such as Question Answering, LUIS, and Deepstack) and return the best response for the user's intent.

[Source code][conversationallanguage_client_src] | [Package (PyPI)][conversationallanguage_pypi_package] | [API reference documentation][conversationallanguage_refdocs] | [Product documentation][conversationallanguage_docs] | [Samples][conversationallanguage_samples]

## _Disclaimer_

_Support for Python 2.7 in Azure SDK Python packages ends 01 January 2022. For more information and questions, please refer to https://github.com/Azure/azure-sdk-for-python/issues/20691_


## Getting started
@@ -9,7 +19,7 @@

* Python 2.7, or 3.6 or later is required to use this package.
* An [Azure subscription][azure_subscription]

* An existing Text Analytics resource

> Note: the new unified Cognitive Language Services are not currently available for deployment.

@@ -22,21 +32,186 @@ pip install azure-ai-language-conversations
```

### Authenticate the client

To interact with the CLU service, you'll need to create an instance of the [ConversationAnalysisClient][conversationanalysis_client_class] class. You will need an **endpoint** and an **API key** to instantiate a client object. For more information regarding authenticating with Cognitive Services, see [Authenticate requests to Azure Cognitive Services][cognitive_auth].

#### Get an API key
You can get the **endpoint** and an **API key** from the Cognitive Services resource in the [Azure Portal][azure_portal].

Alternatively, use the [Azure CLI][azure_cli] command shown below to get the API key from the Cognitive Services resource.

```powershell
az cognitiveservices account keys list --resource-group <resource-group-name> --name <resource-name>
```


#### Create ConversationAnalysisClient
Once you've determined your **endpoint** and **API key**, you can instantiate a `ConversationAnalysisClient`:

```python
from azure.core.credentials import AzureKeyCredential
from azure.ai.language.conversations import ConversationAnalysisClient

endpoint = "https://<resource-name>.api.cognitive.microsoft.com"
credential = AzureKeyCredential("<api-key>")
client = ConversationAnalysisClient(endpoint, credential)
```


## Key concepts

### ConversationAnalysisClient

The [ConversationAnalysisClient][conversationanalysis_client_class] is the primary interface for making predictions using your deployed Conversations models. For asynchronous operations, an async `ConversationAnalysisClient` is available in the `azure.ai.language.conversations.aio` namespace, as shown in the sketch below.
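
A minimal sketch of the async client, assuming it mirrors the synchronous constructor and `analyze_conversations()` signature shown in the examples below (the `<resource-name>`, `<api-key>`, and `<project-name>` placeholders are yours to fill in):

```python
import asyncio

from azure.core.credentials import AzureKeyCredential
from azure.ai.language.conversations.aio import ConversationAnalysisClient
from azure.ai.language.conversations.models import AnalyzeConversationOptions


async def analyze():
    # placeholder endpoint/key; replace with your own resource values
    endpoint = "https://<resource-name>.api.cognitive.microsoft.com"
    credential = AzureKeyCredential("<api-key>")

    client = ConversationAnalysisClient(endpoint, credential)
    async with client:
        result = await client.analyze_conversations(
            AnalyzeConversationOptions(query="One california maki please."),
            project_name="<project-name>",
            deployment_name="production",
        )
    # same result shape as the synchronous examples below
    print("top intent: {}".format(result.prediction.top_intent))


asyncio.run(analyze())  # the async client requires Python 3.7+
```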

## Examples
The `azure-ai-language-conversations` client library provides both synchronous and asynchronous APIs.

The following examples show common scenarios using the `client` [created above](#create-conversationanalysisclient).

### Analyze a conversation with a Deepstack App
If you would like to extract custom intents and entities from a user utterance, call the `client.analyze_conversations()` method with your deepstack project name as follows:
```python
# import libraries
import os
from azure.core.credentials import AzureKeyCredential

from azure.ai.language.conversations import ConversationAnalysisClient
from azure.ai.language.conversations.models import AnalyzeConversationOptions

# get secrets
conv_endpoint = os.environ.get("AZURE_CONVERSATIONS_ENDPOINT")
conv_key = os.environ.get("AZURE_CONVERSATIONS_KEY")
conv_project = os.environ.get("AZURE_CONVERSATIONS_PROJECT")

# prepare data
query = "One california maki please."
input = AnalyzeConversationOptions(
    query=query
)

# analyze query
client = ConversationAnalysisClient(conv_endpoint, AzureKeyCredential(conv_key))
with client:
    result = client.analyze_conversations(
        input,
        project_name=conv_project,
        deployment_name='production'
    )

# view result
print("query: {}".format(result.query))
print("project kind: {}\n".format(result.prediction.project_kind))

print("view top intent:")
print("top intent: {}".format(result.prediction.top_intent))
print("\tcategory: {}".format(result.prediction.intents[0].category))
print("\tconfidence score: {}\n".format(result.prediction.intents[0].confidence_score))

print("view entities:")
for entity in result.prediction.entities:
    print("\tcategory: {}".format(entity.category))
    print("\ttext: {}".format(entity.text))
    print("\tconfidence score: {}".format(entity.confidence_score))
```

### Analyze a conversation with a Workflow App
If you would like to pass the user utterance to your orchestrator (workflow) app, call the `client.analyze_conversations()` method with your workflow project name. The orchestrator project routes the submitted user utterance to the appropriate language app (LUIS, Deepstack, or Question Answering) to get the best response for the user's intent. See the next example:

```python
# import libraries
import os
from azure.core.credentials import AzureKeyCredential

from azure.ai.language.conversations import ConversationAnalysisClient
from azure.ai.language.conversations.models import AnalyzeConversationOptions

# get secrets
conv_endpoint = os.environ.get("AZURE_CONVERSATIONS_ENDPOINT")
conv_key = os.environ.get("AZURE_CONVERSATIONS_KEY")
workflow_project = os.environ.get("AZURE_CONVERSATIONS_WORKFLOW_PROJECT")

# prepare data
query = "How do you make sushi rice?"
input = AnalyzeConversationOptions(
    query=query
)

# analyze query
client = ConversationAnalysisClient(conv_endpoint, AzureKeyCredential(conv_key))
with client:
    result = client.analyze_conversations(
        input,
        project_name=workflow_project,
        deployment_name='production',
    )

# view result
print("query: {}".format(result.query))
print("project kind: {}\n".format(result.prediction.project_kind))

print("view top intent:")
print("top intent: {}".format(result.prediction.top_intent))
print("\tcategory: {}".format(result.prediction.intents[0].category))
print("\tconfidence score: {}\n".format(result.prediction.intents[0].confidence_score))

print("view Question Answering result:")
print("\tresult: {}\n".format(result.prediction.intents[0].result))
```

### Analyze a conversation with a Workflow (Direct) App
If you would like to use an orchestrator (workflow) app and call one of your language apps directly, call the `client.analyze_conversations()` method with your workflow project name and the direct target name that corresponds to one of your language apps, as follows:

```python
# import libraries
import os
from azure.core.credentials import AzureKeyCredential

from azure.ai.language.conversations import ConversationAnalysisClient
from azure.ai.language.conversations.models import (
    AnalyzeConversationOptions,
    QuestionAnsweringParameters,
)

# get secrets
conv_endpoint = os.environ.get("AZURE_CONVERSATIONS_ENDPOINT")
conv_key = os.environ.get("AZURE_CONVERSATIONS_KEY")
workflow_project = os.environ.get("AZURE_CONVERSATIONS_WORKFLOW_PROJECT")

# prepare data
query = "How do you make sushi rice?"
target_intent = "SushiMaking"
input = AnalyzeConversationOptions(
    query=query,
    direct_target=target_intent,
    parameters={
        "SushiMaking": QuestionAnsweringParameters(
            calling_options={
                "question": query,
                "top": 1,
                "confidenceScoreThreshold": 0.1
            }
        )
    }
)

# analyze query
client = ConversationAnalysisClient(conv_endpoint, AzureKeyCredential(conv_key))
with client:
    result = client.analyze_conversations(
        input,
        project_name=workflow_project,
        deployment_name='production',
    )

# view result
print("query: {}".format(result.query))
print("project kind: {}\n".format(result.prediction.project_kind))

print("view top intent:")
print("top intent: {}".format(result.prediction.top_intent))
print("\tcategory: {}".format(result.prediction.intents[0].category))
print("\tconfidence score: {}\n".format(result.prediction.intents[0].confidence_score))

print("view Question Answering result:")
print("\tresult: {}\n".format(result.prediction.intents[0].result))
```
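
A failed call to `analyze_conversations()` raises `azure.core.exceptions.HttpResponseError`, as documented in the generated client further down in this PR. A minimal sketch of guarding the call, reusing the `client`, `input`, and `workflow_project` names from the example above:

```python
from azure.core.exceptions import HttpResponseError

try:
    result = client.analyze_conversations(
        input,
        project_name=workflow_project,
        deployment_name='production',
    )
except HttpResponseError as error:
    # the exception carries the service's error response
    print("Analyze request failed: {}".format(error.message))
```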



## Optional Configuration
@@ -74,6 +249,7 @@ This project has adopted the [Microsoft Open Source Code of Conduct][code_of_con
[azure_cli]: https://docs.microsoft.com/cli/azure/
[azure_portal]: https://portal.azure.com/
[azure_subscription]: https://azure.microsoft.com/free/

[cla]: https://cla.microsoft.com
[coc_contact]: mailto:opencode@microsoft.com
[coc_faq]: https://opensource.microsoft.com/codeofconduct/faq/
@@ -86,4 +262,16 @@ This project has adopted the [Microsoft Open Source Code of Conduct][code_of_con
[azure_core_readme]: https://github.com/Azure/azure-sdk-for-python/blob/main/sdk/core/azure-core/README.md
[pip_link]:https://pypi.org/project/pip/

[conversationallanguage_client_src]: https://github.com/Azure/azure-sdk-for-python/tree/main/sdk/cognitivelanguage/azure-ai-language-conversations

[conversationallanguage_pypi_package]: https://pypi.org/project/azure-ai-language-conversations/

[conversationallanguage_refdocs]: https://github.com/Azure/azure-sdk-for-python/tree/main/sdk/cognitivelanguage/azure-ai-language-conversations

[conversationallanguage_docs]: https://azure.microsoft.com/services/cognitive-services/language-understanding-intelligent-service/

[conversationallanguage_samples]: https://github.com/Azure/azure-sdk-for-python/blob/main/sdk/cognitivelanguage/azure-ai-language-conversations/samples/README.md

[conversationanalysis_client_class]: https://github.com/Azure/azure-sdk-for-python/blob/main/sdk/cognitivelanguage/azure-ai-language-conversations/azure/ai/language/conversations/_conversation_analysis_client.py

![Impressions](https://azure-sdk-impressions.azurewebsites.net/api/impressions/azure-sdk-for-python%2Fsdk%2Ftemplate%2Fazure-template%2FREADME.png)
@@ -60,15 +60,14 @@ def send_request(
# type: (...) -> HttpResponse
"""Runs the network request through the client's chained policies.

We have helper methods to create requests specific to this service in `azure.ai.language.conversations.rest`.
Use these helper methods to create the request you pass to this method.

>>> from azure.core.rest import HttpRequest
>>> request = HttpRequest("GET", "https://www.example.org/")
<HttpRequest [GET], url: 'https://www.example.org/'>
>>> response = client.send_request(request)
<HttpResponse: 200 OK>

For more information on this code flow, see https://aka.ms/azsdk/python/protocol/quickstart

For advanced cases, you can also create your own :class:`~azure.core.rest.HttpRequest`
and pass it in.

:param request: The network request you want to make. Required.
:type request: ~azure.core.rest.HttpRequest
:keyword bool stream: Whether the response payload will be streamed. Defaults to False.
@@ -53,15 +53,14 @@
) -> Awaitable[AsyncHttpResponse]:
"""Runs the network request through the client's chained policies.

We have helper methods to create requests specific to this service in `azure.ai.language.conversations.rest`.
Use these helper methods to create the request you pass to this method.

>>> from azure.core.rest import HttpRequest
>>> request = HttpRequest("GET", "https://www.example.org/")
<HttpRequest [GET], url: 'https://www.example.org/'>
>>> response = await client.send_request(request)
<AsyncHttpResponse: 200 OK>

For more information on this code flow, see https://aka.ms/azsdk/python/protocol/quickstart

For advanced cases, you can also create your own :class:`~azure.core.rest.HttpRequest`
and pass it in.

:param request: The network request you want to make. Required.
:type request: ~azure.core.rest.HttpRequest
:keyword bool stream: Whether the response payload will be streamed. Defaults to False.
@@ -26,34 +26,34 @@ class ConversationAnalysisClientOperationsMixin:
@distributed_trace_async
async def analyze_conversations(
self,
conversation_analysis_input: "_models.ConversationAnalysisInput",
analyze_conversation_options: "_models.AnalyzeConversationOptions",
*,
project_name: str,
deployment_name: str,
**kwargs: Any
) -> "_models.ConversationAnalysisResult":
) -> "_models.AnalyzeConversationResult":
"""Analyzes the input conversation utterance.

:param conversation_analysis_input: Post body of the request.
:type conversation_analysis_input:
~azure.ai.language.conversations.models.ConversationAnalysisInput
:keyword project_name: The project name.
:param analyze_conversation_options: Post body of the request.
:type analyze_conversation_options:
~azure.ai.language.conversations.models.AnalyzeConversationOptions
:keyword project_name: The name of the project to use.
:paramtype project_name: str
:keyword deployment_name: The deployment name/deployed version.
:keyword deployment_name: The name of the specific deployment of the project to use.
:paramtype deployment_name: str
:return: ConversationAnalysisResult
:rtype: ~azure.ai.language.conversations.models.ConversationAnalysisResult
:return: AnalyzeConversationResult
:rtype: ~azure.ai.language.conversations.models.AnalyzeConversationResult
:raises: ~azure.core.exceptions.HttpResponseError
"""
cls = kwargs.pop('cls', None) # type: ClsType["_models.ConversationAnalysisResult"]
cls = kwargs.pop('cls', None) # type: ClsType["_models.AnalyzeConversationResult"]
error_map = {
401: ClientAuthenticationError, 404: ResourceNotFoundError, 409: ResourceExistsError
}
error_map.update(kwargs.pop('error_map', {}))

content_type = kwargs.pop('content_type', "application/json") # type: Optional[str]

json = self._serialize.body(conversation_analysis_input, 'ConversationAnalysisInput')
json = self._serialize.body(analyze_conversation_options, 'AnalyzeConversationOptions')

request = build_analyze_conversations_request(
content_type=content_type,
@@ -67,15 +67,15 @@ async def analyze_conversations(
}
request.url = self._client.format_url(request.url, **path_format_arguments)

pipeline_response = await self._client.send_request(request, stream=False, _return_pipeline_response=True, **kwargs)
pipeline_response = await self._client._pipeline.run(request, stream=False, **kwargs)
response = pipeline_response.http_response

if response.status_code not in [200]:
map_error(status_code=response.status_code, response=response, error_map=error_map)
error = self._deserialize.failsafe_deserialize(_models.ErrorResponse, response)
error = self._deserialize.failsafe_deserialize(_models.ErrorResponse, pipeline_response)
raise HttpResponseError(response=response, model=error)

deserialized = self._deserialize('ConversationAnalysisResult', pipeline_response)
deserialized = self._deserialize('AnalyzeConversationResult', pipeline_response)

if cls:
return cls(pipeline_response, deserialized, {})