Merged
4 changes: 3 additions & 1 deletion sdk/textanalytics/azure-ai-textanalytics/CHANGELOG.md
@@ -1,6 +1,6 @@
# Release History

## 1.0.0-beta.5 (Unreleased)
## 1.0.0-beta.5 (2020-05-27)
**New features**
- Added Text property and `getText()` to `SentenceSentiment`.
- `Warnings` property added to each document-level response object returned from the endpoints. It is a list of `TextAnalyticsWarnings`.
@@ -9,6 +9,8 @@
- Text analytics SDK update the service to version `v3.0` from `v3.0-preview.1`.

**Breaking changes**
- Removed the pagination feature, which removes `TextAnalyticsPagedIterable`, `TextAnalyticsPagedFlux`, and `TextAnalyticsPagedResponse`.
- Removed the overloads that take a list of `String` inputs; only the maximal overload, which takes a list of `String`, a language or country hint, and `TextAnalyticsRequestOptions`, is kept.
- Renamed `apiKey()` to `credential()` on TextAnalyticsClientBuilder.
- Removed `getGraphemeLength()` and `getGraphemeOffset()` from `CategorizedEntity`, `SentenceSentiment`, and `LinkedEntityMatch`.
- `getGraphemeCount()` in `TextDocumentStatistics` has been renamed to `getCharacterCount()`.
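The renames above can be sketched as a before/after migration. This is a hypothetical sketch, not code from this PR: the endpoint shape is a placeholder, and whether the key is wrapped in `AzureKeyCredential` in this version is an assumption.

```java
import com.azure.ai.textanalytics.TextAnalyticsClient;
import com.azure.ai.textanalytics.TextAnalyticsClientBuilder;
import com.azure.core.credential.AzureKeyCredential;

public class MigrationSketch {
    public static void main(String[] args) {
        // Before this release: new TextAnalyticsClientBuilder().apiKey(...)
        // apiKey() was renamed to credential(); the AzureKeyCredential type
        // used here is an assumption for illustration.
        TextAnalyticsClient client = new TextAnalyticsClientBuilder()
            .endpoint("https://<your-resource>.cognitiveservices.azure.com/") // placeholder
            .credential(new AzureKeyCredential("<api-key>"))                   // placeholder
            .buildClient();

        // TextDocumentStatistics#getGraphemeCount() is now getCharacterCount(),
        // and the grapheme-based offset/length accessors were removed.
    }
}
```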
7 changes: 4 additions & 3 deletions sdk/textanalytics/azure-ai-textanalytics/README.md
@@ -18,6 +18,7 @@ and includes six main functions:
- [Cognitive Services or Text Analytics account][text_analytics_account] to use this package.

### Include the Package
**Note:** This version targets Azure Text Analytics service API version v3.0.

[//]: # ({x-version-update-start;com.azure:azure-ai-textanalytics;current})
```xml
@@ -146,8 +147,8 @@ or the number of operation transactions that have gone through, simply call `get
`TextDocumentStatistics` which contains both information.

### Return value collection
An operation result collection, such as `TextAnalyticsPagedResponse<AnalyzeSentimentResult>`, which is the collection of
the result of a Text Analytics analyzing sentiment operation. For `TextAnalyticsPagedResponse` includes the model
An operation result collection, such as `AnalyzeSentimentResultCollection`, which holds the results of a
Text Analytics sentiment analysis operation. It also includes the model
version of the operation and statistics of the batch documents.
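A minimal consumption sketch of the new collection type. It assumes `AnalyzeSentimentResultCollection` is iterable over `AnalyzeSentimentResult` and exposes the model version and batch statistics that its constructor in this PR receives; the client call shape and the `isError()`/`getDocumentSentiment()` accessors are assumptions for illustration.

```java
// Sketch: consuming an AnalyzeSentimentResultCollection (client call and
// per-result accessor names are assumptions, not taken from this diff).
AnalyzeSentimentResultCollection resultCollection =
    textAnalyticsClient.analyzeSentimentBatch(documents, "en", null);

// The collection carries batch-level metadata alongside the results.
System.out.println("Model version: " + resultCollection.getModelVersion());

for (AnalyzeSentimentResult result : resultCollection) {
    if (!result.isError()) {
        System.out.println(result.getDocumentSentiment().getSentiment());
    }
}
```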

### Operation on multiple documents
@@ -280,7 +281,7 @@ List<DetectLanguageInput> documents = Arrays.asList(
);

try {
textAnalyticsClient.detectLanguageBatch(documents, null, Context.NONE);
textAnalyticsClient.detectLanguageBatchWithResponse(documents, null, Context.NONE);
} catch (HttpResponseException e) {
System.out.println(e.getMessage());
}
@@ -13,14 +13,16 @@
import com.azure.ai.textanalytics.implementation.models.SentimentResponse;
import com.azure.ai.textanalytics.implementation.models.WarningCodeValue;
import com.azure.ai.textanalytics.models.AnalyzeSentimentResult;
import com.azure.ai.textanalytics.util.AnalyzeSentimentResultCollection;
import com.azure.ai.textanalytics.models.SentenceSentiment;
import com.azure.ai.textanalytics.models.SentimentConfidenceScores;
import com.azure.ai.textanalytics.models.TextAnalyticsRequestOptions;
import com.azure.ai.textanalytics.models.TextAnalyticsWarning;
import com.azure.ai.textanalytics.models.TextDocumentInput;
import com.azure.ai.textanalytics.util.TextAnalyticsPagedFlux;
import com.azure.ai.textanalytics.util.TextAnalyticsPagedResponse;
import com.azure.ai.textanalytics.models.TextSentiment;
import com.azure.ai.textanalytics.models.WarningCode;
import com.azure.core.exception.HttpResponseException;
import com.azure.core.http.rest.Response;
import com.azure.core.http.rest.SimpleResponse;
import com.azure.core.util.Context;
import com.azure.core.util.IterableStream;
@@ -39,7 +41,7 @@
import static com.azure.ai.textanalytics.implementation.Utility.toMultiLanguageInput;
import static com.azure.ai.textanalytics.implementation.Utility.toTextAnalyticsError;
import static com.azure.ai.textanalytics.implementation.Utility.toTextDocumentStatistics;
import static com.azure.core.util.FluxUtil.fluxError;
import static com.azure.core.util.FluxUtil.monoError;
import static com.azure.core.util.FluxUtil.withContext;
import static com.azure.core.util.tracing.Tracer.AZ_TRACING_NAMESPACE_KEY;

@@ -61,55 +63,53 @@ class AnalyzeSentimentAsyncClient {
}

/**
* Helper function for calling service with max overloaded parameters that a returns {@link TextAnalyticsPagedFlux}
* which is a paged flux that contains {@link AnalyzeSentimentResult}.
Helper function for calling the service with max overloaded parameters that returns a mono {@link Response}
which contains {@link AnalyzeSentimentResultCollection}.
*
* @param documents The list of documents to analyze sentiments for.
* @param options The {@link TextAnalyticsRequestOptions} request options.
*
* @return {@link TextAnalyticsPagedFlux} of {@link AnalyzeSentimentResult}.
* @return A mono {@link Response} that contains {@link AnalyzeSentimentResultCollection}.
*/
TextAnalyticsPagedFlux<AnalyzeSentimentResult> analyzeSentimentBatch(Iterable<TextDocumentInput> documents,
TextAnalyticsRequestOptions options) {
public Mono<Response<AnalyzeSentimentResultCollection>> analyzeSentimentBatch(
Iterable<TextDocumentInput> documents, TextAnalyticsRequestOptions options) {
try {
inputDocumentsValidation(documents);
return new TextAnalyticsPagedFlux<>(() -> (continuationToken, pageSize) -> withContext(context ->
getAnalyzedSentimentResponseInPage(documents, options, context)).flux());
return withContext(context -> getAnalyzedSentimentResponse(documents, options, context));
} catch (RuntimeException ex) {
return new TextAnalyticsPagedFlux<>(() -> (continuationToken, pageSize) -> fluxError(logger, ex));
return monoError(logger, ex);
}
}

/**
* Helper function for calling service with max overloaded parameters that a returns {@link TextAnalyticsPagedFlux}
* which is a paged flux that contains {@link AnalyzeSentimentResult}.
Helper function for calling the service with max overloaded parameters that returns a mono {@link Response}
which contains {@link AnalyzeSentimentResultCollection}.
*
* @param documents The list of documents to analyze sentiments for.
* @param options The {@link TextAnalyticsRequestOptions} request options.
* @param context Additional context that is passed through the Http pipeline during the service call.
*
* @return The {@link TextAnalyticsPagedFlux} of {@link AnalyzeSentimentResult}.
* @return A mono {@link Response} that contains {@link AnalyzeSentimentResultCollection}.
*/
TextAnalyticsPagedFlux<AnalyzeSentimentResult> analyzeSentimentBatchWithContext(
Mono<Response<AnalyzeSentimentResultCollection>> analyzeSentimentBatchWithContext(
Iterable<TextDocumentInput> documents, TextAnalyticsRequestOptions options, Context context) {
try {
inputDocumentsValidation(documents);
return new TextAnalyticsPagedFlux<>(() -> (continuationToken, pageSize) ->
getAnalyzedSentimentResponseInPage(documents, options, context).flux());
return getAnalyzedSentimentResponse(documents, options, context);
} catch (RuntimeException ex) {
return new TextAnalyticsPagedFlux<>(() -> (continuationToken, pageSize) -> fluxError(logger, ex));
return monoError(logger, ex);
}
}

/**
* Helper method to convert the service response of {@link SentimentResponse} to {@link TextAnalyticsPagedResponse}
* of {@link AnalyzeSentimentResult}.
* Helper method to convert the service response of {@link SentimentResponse} to {@link Response} that contains
* {@link AnalyzeSentimentResultCollection}.
*
* @param response The {@link SimpleResponse} of {@link SentimentResponse} returned by the service.
*
* @return The {@link TextAnalyticsPagedResponse} of {@link AnalyzeSentimentResult} returned by the SDK.
* @return A {@link Response} that contains {@link AnalyzeSentimentResultCollection}.
*/
private TextAnalyticsPagedResponse<AnalyzeSentimentResult> toTextAnalyticsPagedResponse(
private Response<AnalyzeSentimentResultCollection> toAnalyzeSentimentResultCollectionResponse(
SimpleResponse<SentimentResponse> response) {
final SentimentResponse sentimentResponse = response.getValue();
final List<AnalyzeSentimentResult> analyzeSentimentResults = new ArrayList<>();
@@ -130,11 +130,9 @@ private TextAnalyticsPagedResponse<AnalyzeSentimentResult> toTextAnalyticsPagedR
analyzeSentimentResults.add(new AnalyzeSentimentResult(documentError.getId(), null,
toTextAnalyticsError(documentError.getError()), null));
}
return new TextAnalyticsPagedResponse<>(
response.getRequest(), response.getStatusCode(), response.getHeaders(),
analyzeSentimentResults, null,
sentimentResponse.getModelVersion(),
sentimentResponse.getStatistics() == null ? null : toBatchStatistics(sentimentResponse.getStatistics()));
return new SimpleResponse<>(response,
new AnalyzeSentimentResultCollection(analyzeSentimentResults, sentimentResponse.getModelVersion(),
sentimentResponse.getStatistics() == null ? null : toBatchStatistics(sentimentResponse.getStatistics())));
}

/**
@@ -154,7 +152,7 @@ private AnalyzeSentimentResult convertToAnalyzeSentimentResult(DocumentSentiment
sentenceSentiment.getConfidenceScores();
final SentenceSentimentValue sentenceSentimentValue = sentenceSentiment.getSentiment();
return new SentenceSentiment(sentenceSentiment.getText(),
sentenceSentimentValue == null ? null : sentenceSentimentValue.toString(),
TextSentiment.fromString(sentenceSentimentValue == null ? null : sentenceSentimentValue.toString()),
new SentimentConfidenceScores(confidenceScorePerSentence.getNegative(),
confidenceScorePerSentence.getNeutral(), confidenceScorePerSentence.getPositive()));
}).collect(Collectors.toList());
@@ -163,7 +161,8 @@ private AnalyzeSentimentResult convertToAnalyzeSentimentResult(DocumentSentiment
final List<TextAnalyticsWarning> warnings = documentSentiment.getWarnings().stream().map(
warning -> {
final WarningCodeValue warningCodeValue = warning.getCode();
return new TextAnalyticsWarning(warningCodeValue == null ? null : warningCodeValue.toString(),
return new TextAnalyticsWarning(
WarningCode.fromString(warningCodeValue == null ? null : warningCodeValue.toString()),
warning.getMessage());
}).collect(Collectors.toList());

@@ -174,7 +173,7 @@ private AnalyzeSentimentResult convertToAnalyzeSentimentResult(DocumentSentiment
? null : toTextDocumentStatistics(documentSentiment.getStatistics()),
null,
new com.azure.ai.textanalytics.models.DocumentSentiment(
documentSentimentValue == null ? null : documentSentimentValue.toString(),
TextSentiment.fromString(documentSentimentValue == null ? null : documentSentimentValue.toString()),
new SentimentConfidenceScores(
confidenceScorePerLabel.getNegative(),
confidenceScorePerLabel.getNeutral(),
@@ -184,16 +183,16 @@ private AnalyzeSentimentResult convertToAnalyzeSentimentResult(DocumentSentiment
}

/**
* Call the service with REST response, convert to a {@link Mono} of {@link TextAnalyticsPagedResponse} of
* {@link AnalyzeSentimentResult} from a {@link SimpleResponse} of {@link SentimentResponse}.
* Call the service with REST response, convert to a {@link Mono} of {@link Response} which contains
* {@link AnalyzeSentimentResultCollection} from a {@link SimpleResponse} of {@link SentimentResponse}.
*
* @param documents A list of documents to be analyzed.
* @param options The {@link TextAnalyticsRequestOptions} request options.
* @param context Additional context that is passed through the Http pipeline during the service call.
*
* @return A {@link Mono} of {@link TextAnalyticsPagedResponse} of {@link AnalyzeSentimentResult}.
* @return A mono {@link Response} that contains {@link AnalyzeSentimentResultCollection}.
*/
private Mono<TextAnalyticsPagedResponse<AnalyzeSentimentResult>> getAnalyzedSentimentResponseInPage(
private Mono<Response<AnalyzeSentimentResultCollection>> getAnalyzedSentimentResponse(
Iterable<TextDocumentInput> documents, TextAnalyticsRequestOptions options, Context context) {
return service.sentimentWithResponseAsync(
new MultiLanguageBatchInput().setDocuments(toMultiLanguageInput(documents)),
Expand All @@ -203,7 +202,7 @@ private Mono<TextAnalyticsPagedResponse<AnalyzeSentimentResult>> getAnalyzedSent
.doOnSubscribe(ignoredValue -> logger.info("A batch of documents - {}", documents.toString()))
.doOnSuccess(response -> logger.info("Analyzed sentiment for a batch of documents - {}", response))
.doOnError(error -> logger.warning("Failed to analyze sentiment - {}", error))
.map(this::toTextAnalyticsPagedResponse)
.map(this::toAnalyzeSentimentResultCollectionResponse)
.onErrorMap(throwable -> mapToHttpResponseExceptionIfExist(throwable));
}
}