# Elastic analytics environment for Azure Monitor data

The context of this tip is a need to explore a dataset from Azure Monitor. Kibana and Elasticsearch are a powerful combination for exploring a previously unknown set of data.

For the following commands the assumption is that you have a Linux or macOS environment on your computer. A howto on installing the software requirements is out of scope for this document: docker, docker-compose, az cli, jq, git.
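
If you want a quick sanity check before starting, the loop below verifies that the required tools are on your PATH. A minimal sketch; the binary names are the usual defaults and may differ on your distribution.

```sh
# Verify that the required tools are installed and on PATH
for tool in docker docker-compose az jq git; do
  command -v "$tool" >/dev/null 2>&1 || echo "missing: $tool"
done
```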

## Deploy the Elastic stack

Start a stack of Elastic analytics tools using docker and deviantony's docker-elk project as a starting point. This will download and initialize a full Elastic stack.

```sh
git clone https://github.com/deviantony/docker-elk.git
cd docker-elk
docker-compose up -d
```
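
Once the containers are up you can confirm that Elasticsearch answers before moving on. A minimal sketch; elastic/changeme are the default docker-elk credentials, so adjust if you have changed them.

```sh
# Check container status and confirm Elasticsearch responds
docker-compose ps
curl -u elastic:changeme http://localhost:9200
```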

## Get the Azure data set

Depending on your environment you should elevate with Azure Privileged Identity Management (Azure PIM) or log on to the relevant subscription in https://portal.azure.com to find the workspace ID needed for exporting data.
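
The workspace ID can also be looked up with az cli instead of the portal, since it is exposed as the customerId property of the workspace. The resource group and workspace names below are placeholders for your own.

```sh
# Look up the workspace ID (customerId) of a Log Analytics workspace
az monitor log-analytics workspace show \
  --resource-group MyResourceGroup \
  --workspace-name MyWorkspace \
  --query customerId -o tsv
```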

From the relevant Log Analytics workspace, get 24 hours' worth of JSON data and pipe it through jq and gzip to produce a nicely formatted, compressed JSON file. In this example the software inventory data is stored in the table ConfigurationData, so we specify that table in the query.

```sh
az login
az account set --subscription "MySubscriptionName"
workSpace=fa99b26f-2a91-416e-ae59-52112dc57a1b
az monitor log-analytics query -w $workSpace --analytics-query 'ConfigurationData' -t PT24H | jq -c '.[]' | gzip > 24h-data.json.gz
```
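
Before indexing it can be worth peeking at the export to confirm the query returned what you expected. A minimal sketch; gzcat is the macOS name, on most Linux distributions the equivalent is zcat.

```sh
# Count the exported records and pretty-print the first one
gzcat 24h-data.json.gz | wc -l
gzcat 24h-data.json.gz | head -n 1 | jq .
```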

## Index the dataset

Spool the gzip-compressed file into Elasticsearch, again piping it through jq, this time to interleave the action lines the bulk API expects. Note also that we ask for the documents to go into an index called configdata-2021. See links below for more details.

```sh
gzcat 24h-data.json.gz | jq -c '. | {"index": {"_index": "configdata-2021"}}, .' | curl -u elastic:changeme -XPOST "http://localhost:9200/_bulk" -H 'Content-Type: application/json' --data-binary @-
```
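
A quick way to confirm the bulk load succeeded is to ask Elasticsearch how many documents landed in the new index; the count should match the number of records in the export.

```sh
# Document count should match the number of exported records
curl -u elastic:changeme "http://localhost:9200/configdata-2021/_count?pretty"
```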

If everything went well, there should now be a lot of data in Elasticsearch ready for you to visualize and analyze using Kibana.
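
To browse the data you need an index pattern in Kibana (http://localhost:5601) pointing at the new index. You can click one together under Stack Management, or create it over the saved objects API as sketched below. Assumptions here: a Kibana 7.x-style API, and TimeGenerated as the time field, which is the usual timestamp column in Log Analytics data.

```sh
# Create a Kibana index pattern for the configdata-2021 index
curl -u elastic:changeme -X POST "http://localhost:5601/api/saved_objects/index-pattern" \
  -H 'kbn-xsrf: true' -H 'Content-Type: application/json' \
  -d '{"attributes": {"title": "configdata-2021", "timeFieldName": "TimeGenerated"}}'
```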

## Next steps

## Cleanup

Stop the docker containers and remove the data volume from the local computer.

```sh
docker-compose down
docker volume rm docker-elk_elasticsearch
```

## Links