This tip covers exploring a dataset exported from Azure Monitor. Kibana and Elasticsearch are a powerful combination for exploring a previously unknown set of data.
The following commands assume a Linux or macOS environment on your computer. Installing the required software is out of scope for this document: docker, docker-compose, az cli, jq, git
Start a stack of Elastic analytics tools using docker, with deviantony's docker-elk project as a starting point. This will download and initialize a full Elastic stack.
git clone https://github.com/deviantony/docker-elk.git
cd docker-elk
docker-compose up -d
Depending on your environment, you should elevate with Azure Privileged Identity Management (Azure PIM) or log on to the relevant subscription at https://portal.azure.com to find the workspace ID needed for exporting data.
From the relevant Log Analytics workspace, get 24 hours' worth of JSON data and pipe it through jq and gzip to produce a nicely formatted, compressed JSON file. E.g. software inventory data is stored in the table ConfigurationData, so we specify that table in the query.
az login
az account set --subscription "MySubscriptionName"
workSpace=fa99b26f-2a91-416e-ae59-52112dc57a1b
az monitor log-analytics query -w $workSpace --analytics-query 'ConfigurationData' -t PT24H | jq -c '.[]' | gzip > 24h-data.json.gz
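The jq -c '.[]' step matters because az returns one large JSON array, while the Elasticsearch bulk API wants newline-delimited JSON (NDJSON). A minimal illustration with made-up sample records (the field names are just examples, not real export output):

```shell
# az monitor log-analytics query returns a JSON array; jq -c '.[]' unwraps it
# into one compact JSON object per line (NDJSON), ready for bulk loading.
echo '[{"Computer":"web01","SoftwareName":"nginx"},{"Computer":"db01","SoftwareName":"postgres"}]' \
  | jq -c '.[]'
# Prints:
# {"Computer":"web01","SoftwareName":"nginx"}
# {"Computer":"db01","SoftwareName":"postgres"}
```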
Spool the gz-compressed file into Elasticsearch, again piping it through jq for extra formatting. Note that the command also creates an index called configdata-2021. See links below for more details.
gzcat 24h-data.json.gz | jq -c '. | {"index": {"_index": "configdata-2021", "_type": "doc_type"}}, .' | curl -u elastic:changeme -XPOST "http://localhost:9200/_bulk" -H 'Content-Type: application/json' --data-binary @-
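To see what the bulk-formatting jq filter does, feed it a single record: it emits an action line naming the target index, followed by the document itself, which is exactly the action/document pairing the _bulk endpoint expects. (The sample record below is made up for illustration.)

```shell
# The filter turns each input object into two lines: the bulk "index"
# action naming the target index, then the document itself.
echo '{"Computer":"web01","TimeGenerated":"2021-05-01T12:00:00Z"}' \
  | jq -c '. | {"index": {"_index": "configdata-2021", "_type": "doc_type"}}, .'
# Prints:
# {"index":{"_index":"configdata-2021","_type":"doc_type"}}
# {"Computer":"web01","TimeGenerated":"2021-05-01T12:00:00Z"}
```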
If everything went well there should now be a lot of data in Elasticsearch ready for you to visualize and analyze using Kibana.
- Log on to the local Kibana instance at http://localhost:5601 with username elastic and password changeme.
- Create an index pattern at http://localhost:5601/app/management/kibana/indexPatterns using TimeGenerated as the time field from the configdata-2021 index. If the index name is not listed, go back and read the output of the bulk command above.
- Go to Kibana to explore the new index pattern you created above: http://localhost:5601/app/discover
- Remember to set a relevant time period. E.g. the last 24 hours is http://localhost:5601/app/discover#/?_g=(filters:!(),refreshInterval:(pause:!t,value:0),time:(from:now-24h,to:now))
When done, stop the docker containers and remove the data volume from the local computer.
docker-compose down
docker volume rm docker-elk_elasticsearch
- Elasticsearch documentation site
- Consider watching a few YouTube videos on how to explore and query data. E.g.: https://www.youtube.com/watch?v=t3cebUxRliA
- Link to "Indexing bulk documents to Elasticsearch using jq"