[Monitor-ingestion] remove error status (#23585)
KarishmaGhiya authored Feb 15, 2023
1 parent 5a411f5 commit 9b72f4b
Showing 50 changed files with 2,066 additions and 398 deletions.
5 changes: 4 additions & 1 deletion common/config/rush/pnpm-lock.yaml

Some generated files are not rendered by default.

5 changes: 5 additions & 0 deletions sdk/monitor/monitor-ingestion/.vscode/settings.json
@@ -0,0 +1,5 @@
{
"cSpell.words": [
"pako"
]
}
9 changes: 7 additions & 2 deletions sdk/monitor/monitor-ingestion/CHANGELOG.md
@@ -1,15 +1,20 @@
# Release History

## 1.0.0-beta.4 (Unreleased)
## 1.0.0 (2023-02-15)

### Features Added

- Added an error callback for inspecting failures of individual log uploads via `UploadLogsOptions`.
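
  A minimal sketch of what using this callback could look like. The `onError` option name and the callback's argument shape are assumptions for illustration only (the `cause` and `failedLogs` properties mirror those used in the README examples), and the endpoint, rule ID, and stream name are placeholders:

  ```js
  const { DefaultAzureCredential } = require("@azure/identity");
  const { LogsIngestionClient } = require("@azure/monitor-ingestion");

  async function main() {
    // Placeholder endpoint and rule ID; replace with your data collection endpoint and DCR immutable ID.
    const client = new LogsIngestionClient(process.env.LOGS_INGESTION_ENDPOINT, new DefaultAzureCredential());
    const logs = [{ Time: new Date().toISOString(), Computer: "Computer1", AdditionalContext: "context" }];

    await client.upload(process.env.DATA_COLLECTION_RULE_ID, "Custom-MyTableRawData", logs, {
      // `onError` is a hypothetical name for the per-upload error callback.
      onError: (uploadError) => {
        console.error(`Error - ${JSON.stringify(uploadError.cause)}`);
        console.error(`Log - ${JSON.stringify(uploadError.failedLogs)}`);
      },
    });
  }

  main().catch(console.error);
  ```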

### Breaking Changes

### Bugs Fixed
- The `upload` API throws an `AggregateUploadLogsError` in case of an error and returns `void` when the operation succeeds.
- Renamed `UploadLogsError` to `UploadLogsFailure`.

### Other Changes

- `UploadLogsOptions` supports `OperationOptions` from `@azure/core-client`, allowing users to abort in-flight upload operations when needed (see the sketch after this list).
- Added samples for all error handling scenarios.
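
A minimal sketch of aborting an in-flight upload through the inherited `OperationOptions`. The `abortSignal` property comes from `OperationOptions`; the endpoint, rule ID, and stream name are placeholders:

```js
const { AbortController } = require("@azure/abort-controller");
const { DefaultAzureCredential } = require("@azure/identity");
const { LogsIngestionClient } = require("@azure/monitor-ingestion");

async function main() {
  const client = new LogsIngestionClient(process.env.LOGS_INGESTION_ENDPOINT, new DefaultAzureCredential());
  const logs = [{ Time: new Date().toISOString(), Computer: "Computer1", AdditionalContext: "context" }];

  // Abort the upload if it has not completed within 30 seconds.
  const controller = new AbortController();
  const timer = setTimeout(() => controller.abort(), 30000);
  try {
    await client.upload(process.env.DATA_COLLECTION_RULE_ID, "Custom-MyTableRawData", logs, {
      abortSignal: controller.signal,
    });
  } finally {
    clearTimeout(timer);
  }
}

main().catch(console.error);
```
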
## 1.0.0-beta.3 (2022-10-13)

### Breaking Changes
78 changes: 48 additions & 30 deletions sdk/monitor/monitor-ingestion/README.md
@@ -1,6 +1,6 @@
# Azure Monitor Ingestion client library for JS

The Azure Monitor Ingestion client library is used to send custom logs to [Azure Monitor][azure_monitor_overview].
The Azure Monitor Ingestion client library is used to send custom logs to [Azure Monitor][azure_monitor_overview] using the [Logs Ingestion API][ingestion_overview].

This library allows you to send data from virtually any source to supported built-in tables or to custom tables that you create in Log Analytics workspace. You can even extend the schema of built-in tables with custom columns.

@@ -16,9 +16,9 @@ This library allows you to send data from virtually any source to supported buil
### Prerequisites

- An [Azure subscription](https://azure.microsoft.com/free)
- A [Data Collection Endpoint](https://docs.microsoft.com/azure/azure-monitor/essentials/data-collection-endpoint-overview)
- A [Data Collection Rule](https://docs.microsoft.com/azure/azure-monitor/essentials/data-collection-rule-overview)
- A [Log Analytics workspace](https://docs.microsoft.com/azure/azure-monitor/logs/log-analytics-workspace-overview)
- A [Data Collection Endpoint](https://learn.microsoft.com/azure/azure-monitor/essentials/data-collection-endpoint-overview)
- A [Data Collection Rule](https://learn.microsoft.com/azure/azure-monitor/essentials/data-collection-rule-overview)
- A [Log Analytics workspace](https://learn.microsoft.com/azure/azure-monitor/logs/log-analytics-workspace-overview)

### Install the package

@@ -30,7 +30,7 @@ npm install @azure/monitor-ingestion

### Authenticate the client

An authenticated client is required to ingest data. To authenticate, create an instance of a [TokenCredential](https://docs.microsoft.com/javascript/api/@azure/core-auth/tokencredential?view=azure-node-latest) class (see [@azure/identity](https://www.npmjs.com/package/@azure/identity) for `DefaultAzureCredential` and other `TokenCredential` implementations). Pass it to the constructor of your client class.
An authenticated client is required to ingest data. To authenticate, create an instance of a [TokenCredential](https://learn.microsoft.com/javascript/api/@azure/core-auth/tokencredential?view=azure-node-latest) class (see [@azure/identity](https://www.npmjs.com/package/@azure/identity) for `DefaultAzureCredential` and other `TokenCredential` implementations). Pass it to the constructor of your client class.

To authenticate, the following example uses `DefaultAzureCredential` from the [@azure/identity](https://www.npmjs.com/package/@azure/identity) package:
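
A minimal sketch of constructing the client with `DefaultAzureCredential`, assuming the data collection endpoint is supplied through an environment variable (the variable name is a placeholder):

```js
const { DefaultAzureCredential } = require("@azure/identity");
const { LogsIngestionClient } = require("@azure/monitor-ingestion");

// The data collection endpoint (DCE), e.g. "https://<endpoint-name>.<region>.ingest.monitor.azure.com".
const logsIngestionEndpoint = process.env.LOGS_INGESTION_ENDPOINT;

const credential = new DefaultAzureCredential();
const client = new LogsIngestionClient(logsIngestionEndpoint, credential);
```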

@@ -65,17 +65,17 @@ The DCR must understand the structure of the input data and the structure of the
it can use a transformation to convert the source data to match the target table. You may also use the transform to
filter source data and perform any other calculations or conversions.

For more details, refer to [Data collection rules in Azure Monitor][data_collection_rule].
For more details, refer to [Data collection rules in Azure Monitor][data_collection_rule]. For information on how to retrieve a DCR ID, see [this tutorial][data_collection_rule_tutorial].

### Log Analytics workspace tables

Custom logs can send data to any custom table that you create and to certain built-in tables in your Log Analytics
workspace. The target table must exist before you can send data to it. The following built-in tables are currently supported:

- [CommonSecurityLog](https://docs.microsoft.com/azure/azure-monitor/reference/tables/commonsecuritylog)
- [SecurityEvents](https://docs.microsoft.com/azure/azure-monitor/reference/tables/securityevent)
- [Syslog](https://docs.microsoft.com/azure/azure-monitor/reference/tables/syslog)
- [WindowsEvents](https://docs.microsoft.com/azure/azure-monitor/reference/tables/windowsevent)
- [CommonSecurityLog](https://learn.microsoft.com/azure/azure-monitor/reference/tables/commonsecuritylog)
- [SecurityEvents](https://learn.microsoft.com/azure/azure-monitor/reference/tables/securityevent)
- [Syslog](https://learn.microsoft.com/azure/azure-monitor/reference/tables/syslog)
- [WindowsEvents](https://learn.microsoft.com/azure/azure-monitor/reference/tables/windowsevent)

## Examples

@@ -86,10 +86,10 @@ You can familiarize yourself with different APIs using [Samples](https://github.

### Upload custom logs

You can create a client and call the client's `Upload` method. Take note of the data ingestion [limits](https://docs.microsoft.com/azure/azure-monitor/service-limits#custom-logs).
You can create a client and call the client's `Upload` method. Take note of the data ingestion [limits](https://learn.microsoft.com/azure/azure-monitor/service-limits#custom-logs).

```js
const { DefaultAzureCredential } = require("@azure/identity");
const { DefaultAzureCredential } = require("@azure/identity");
const { isAggregateUploadLogsError, LogsIngestionClient } = require("@azure/monitor-ingestion");

require("dotenv").config();
Expand All @@ -112,15 +112,23 @@ async function main() {
AdditionalContext: "context",
},
];
const result = await client.upload(ruleId, streamName, logs);
if (result.status !== "Success") {
console.log("Some logs have failed to complete ingestion. Upload status=", result.status);
for (const errors of result.errors) {
console.log(`Error - ${JSON.stringify(errors.cause)}`);
console.log(`Log - ${JSON.stringify(errors.failedLogs)}`);
try {
  await client.upload(ruleId, streamName, logs);
} catch (e) {
  let aggregateErrors = isAggregateUploadLogsError(e) ? e.errors : [];
  if (aggregateErrors.length > 0) {
    console.log("Some logs have failed to complete ingestion");
    for (const error of aggregateErrors) {
      console.log(`Error - ${JSON.stringify(error.cause)}`);
      console.log(`Log - ${JSON.stringify(error.failedLogs)}`);
    }
  } else {
    console.log(e);
  }
}
}

main().catch((err) => {
console.error("The sample encountered an error:", err);
process.exit(1);
@@ -184,7 +192,7 @@ When uploading more than 1MB of logs in a single call to the `upload` method on

```js
const { DefaultAzureCredential } = require("@azure/identity");
const { LogsIngestionClient } = require("@azure/monitor-ingestion");
const { isAggregateUploadLogsError, LogsIngestionClient } = require("@azure/monitor-ingestion");

require("dotenv").config();

@@ -205,13 +213,20 @@ async function main() {
});
}

// Set the maximum concurrency to 1 to prevent concurrent requests entirely
const result = await client.upload(ruleId, streamName, logs, { maxConcurrency: 1 });
if (result.status !== "Success") {
console.log("Some logs have failed to complete ingestion. Upload status=", result.status);
for (const error of result.errors) {
console.log(`Error - ${JSON.stringify(error.cause)}`);
console.log(`Log - ${JSON.stringify(error.failedLogs)}`);
try {
  // Set the maximum concurrency to 1 to prevent concurrent requests entirely
  await client.upload(ruleId, streamName, logs, { maxConcurrency: 1 });
} catch (e) {
  let aggregateErrors = isAggregateUploadLogsError(e) ? e.errors : [];
  if (aggregateErrors.length > 0) {
    console.log("Some logs have failed to complete ingestion");
    for (const error of aggregateErrors) {
      console.log(`Error - ${JSON.stringify(error.cause)}`);
      console.log(`Log - ${JSON.stringify(error.failedLogs)}`);
    }
  } else {
    console.log(e);
  }
}
}
@@ -243,17 +258,20 @@ For detailed instructions on how to enable logs, see the [@azure/logger package
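
Enabling client logging with `@azure/logger` generally looks like the following sketch; setting the `AZURE_LOG_LEVEL` environment variable is an alternative to calling `setLogLevel`:

```js
const { setLogLevel } = require("@azure/logger");

// Emit informational (and more severe) Azure SDK client logs to the console.
setLogLevel("info");
```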

## Next steps

To learn more about Azure Monitor, see the [Azure Monitor service documentation][azure_monitor_overview].
To learn more about Azure Monitor, see the [Azure Monitor service documentation][azure_monitor_overview]. See the samples directory for detailed [examples][azure_monitor_samples] of how to use this library.

## Contributing

If you'd like to contribute to this library, please read the [contributing guide](https://github.com/Azure/azure-sdk-for-js/blob/main/CONTRIBUTING.md) to learn more about how to build and test the code.

<!-- LINKS -->

[azure_monitor_overview]: https://docs.microsoft.com/azure/azure-monitor/overview
[data_collection_endpoint]: https://docs.microsoft.com/azure/azure-monitor/essentials/data-collection-endpoint-overview
[data_collection_rule]: https://docs.microsoft.com/azure/azure-monitor/essentials/data-collection-rule-overview
[azure_monitor_overview]: https://learn.microsoft.com/azure/azure-monitor/overview
[data_collection_endpoint]: https://learn.microsoft.com/azure/azure-monitor/essentials/data-collection-endpoint-overview
[data_collection_rule]: https://learn.microsoft.com/azure/azure-monitor/essentials/data-collection-rule-overview
[data_collection_rule_tutorial]: https://learn.microsoft.com/azure/azure-monitor/logs/tutorial-logs-ingestion-portal#collect-information-from-the-dcr
[ingestion_overview]: https://learn.microsoft.com/azure/azure-monitor/logs/logs-ingestion-api-overview
[azure_monitor_samples]: https://github.com/Azure/azure-sdk-for-js/tree/main/sdk/monitor/monitor-ingestion/samples/v1-beta
[monitor_query]: https://github.com/Azure/azure-sdk-for-js/tree/main/sdk/monitor/monitor-query

![Impressions](https://azure-sdk-impressions.azurewebsites.net/api/impressions/azure-sdk-for-js/sdk/monitor/monitor-ingestion/README.png)
9 changes: 6 additions & 3 deletions sdk/monitor/monitor-ingestion/package.json
@@ -1,6 +1,6 @@
{
"name": "@azure/monitor-ingestion",
"version": "1.0.0-beta.4",
"version": "1.0.0",
"description": "Azure Monitor Ingestion library",
"sdk-type": "client",
"main": "dist/index.js",
@@ -44,7 +44,7 @@
"build": "npm run clean && tsc -p . && npm run build:nodebrowser && api-extractor run --local && npm run build:types",
"check-format": "prettier --list-different --config ../../../.prettierrc.json --ignore-path ../../../.prettierignore \"src/**/*.ts\" \"test/**/*.ts\" \"*.{js,json}\"",
"clean": "rimraf dist dist-* temp types *.tgz *.log",
"execute:samples": "echo Obselete",
"execute:samples": "echo Obsolete",
"extract-api": "tsc -p . && api-extractor run --local",
"format": "prettier --write --config ../../../.prettierrc.json --ignore-path ../../../.prettierignore \"src/**/*.ts\" \"test/**/*.ts\" \"samples-dev/**/*.ts\" \"*.{js,json}\"",
"generate:client": "autorest --typescript ./swagger/README.md",
@@ -87,16 +87,17 @@
"sideEffects": false,
"prettier": "@azure/eslint-plugin-azure-sdk/prettier.json",
"dependencies": {
"@azure/abort-controller": "^1.1.0",
"@azure/core-auth": "^1.3.0",
"@azure/core-client": "^1.5.0",
"@azure/core-rest-pipeline": "^1.4.0",
"@azure/core-tracing": "^1.0.0",
"@azure/core-util": "^1.0.0",
"@azure/logger": "^1.0.0",
"tslib": "^2.2.0",
"pako": "^2.0.4"
},
"devDependencies": {
"@azure/core-util": "^1.0.0",
"@azure-tools/test-credential": "~1.0.0",
"@azure-tools/test-recorder": "^2.0.0",
"@azure/dev-tool": "^1.0.0",
@@ -109,6 +110,7 @@
"@types/mocha": "^7.0.2",
"@types/node": "^14.0.0",
"@types/pako": "^2.0.0",
"@types/sinon": "^9.0.4",
"chai": "^4.2.0",
"cross-env": "^7.0.2",
"dotenv": "^16.0.0",
@@ -133,6 +135,7 @@
"nyc": "^15.0.0",
"prettier": "^2.5.1",
"rimraf": "^3.0.0",
"sinon": "^9.0.2",
"source-map-support": "^0.5.9",
"typescript": "~4.8.0",
"util": "^0.12.1"

Large diffs are not rendered by default.