logstash-output-azure_loganalytics is a Logstash plugin that outputs events to Azure Log Analytics. Logstash is an open source, server-side data processing pipeline that ingests data from a multitude of sources simultaneously, transforms it, and then sends it to your favorite destinations. Log Analytics is a service in Operations Management Suite (OMS) that helps you collect and analyze data generated by resources in your cloud and on-premises environments. It gives you real-time insights using integrated search and custom dashboards to readily analyze millions of records across all of your workloads and servers, regardless of their physical location. The plugin stores incoming events in Azure Log Analytics by leveraging the Log Analytics HTTP Data Collector API.
You can install this plugin using the Logstash "plugin" or "logstash-plugin" (for newer versions of Logstash) command:
bin/plugin install logstash-output-azure_loganalytics
# or, for newer versions of Logstash
bin/logstash-plugin install logstash-output-azure_loganalytics
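On newer versions of Logstash you can verify the installation by listing the installed plugins:
bin/logstash-plugin list | grep azure_loganalytics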
Please see the Logstash reference for more information.
output {
    azure_loganalytics {
        customer_id => "<OMS WORKSPACE ID>"
        shared_key => "<CLIENT AUTH KEY>"
        log_type => "<LOG TYPE NAME>"
        key_names => ['key1','key2','key3'..] ## list of key names (array)
        flush_items => <FLUSH_ITEMS_NUM>
        flush_interval_time => <FLUSH INTERVAL TIME(sec)>
    }
}
- customer_id (required) - Your Operations Management Suite workspace ID
- shared_key (required) - The primary or the secondary Connected Sources client authentication key.
- log_type (required) - The name of the event type that is being submitted to Log Analytics. The name can contain only alphabetic characters.
- time_generated_field (optional) - Default: '' (empty string). The name of the time generated field. Be careful that the value of this field must strictly follow the ISO 8601 format (YYYY-MM-DDThh:mm:ssZ); see the configuration sketch after this list for an example.
- key_names (optional) - Default: [] (empty array). List of key names in the incoming record to deliver.
- flush_items (optional) - Default: 50. Max number of items to buffer before flushing (1 - 1000).
- flush_interval_time (optional) - Default: 5. Max number of seconds to wait between flushes.
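Here is a minimal sketch of using time_generated_field. The field name "event_time" is purely illustrative (it is not defined by the plugin); whatever field you reference must already hold an ISO 8601 timestamp string.
output {
    azure_loganalytics {
        customer_id => "<OMS WORKSPACE ID>"
        shared_key => "<CLIENT AUTH KEY>"
        log_type => "<LOG TYPE NAME>"
        # "event_time" is an illustrative field name; its value must already be
        # an ISO 8601 string such as "2016-12-29T01:38:16Z"
        time_generated_field => "event_time"
    }
}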
Here is an example configuration where Logstash's event source and destination are configured as an Apache2 access log and Azure Log Analytics, respectively.
input {
    file {
        path => "/var/log/apache2/access.log"
        start_position => "beginning"
    }
}

filter {
    if [path] =~ "access" {
        mutate { replace => { "type" => "apache_access" } }
        grok {
            match => { "message" => "%{COMBINEDAPACHELOG}" }
        }
    }
    date {
        match => [ "timestamp" , "dd/MMM/yyyy:HH:mm:ss Z" ]
    }
}
output {
    azure_loganalytics {
        customer_id => "818f7bbc-8034-4cc3-b97d-f068dd4cd659"
        shared_key => "ppC5500KzCcDsOKwM1yWUvZydCuC3m+ds/2xci0byeQr1G3E0Jkygn1N0Rxx/yVBUrDE2ok3vf4ksXxcBmQQHw==(dummy)"
        log_type => "ApacheAccessLog"
        # key names must match the fields produced by the grok COMBINEDAPACHELOG pattern above
        key_names => ['clientip','ident','auth','timestamp','verb','request','httpversion','response','bytes','referrer','agent']
        flush_items => 10
        flush_interval_time => 5
    }
    # for debug
    stdout { codec => rubydebug }
}
You can find example configuration files in logstash-output-azure_loganalytics/examples.
Now run Logstash with the example configuration like this:
# Test your logstash configuration before actually running the logstash
bin/logstash -f logstash-apache2-to-loganalytics.conf --configtest
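# note: newer versions of Logstash replace --configtest with --config.test_and_exit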
# run
bin/logstash -f logstash-apache2-to-loganalytics.conf
Here is the expected output for the following sample input (an Apache2 access log line):
Apache2 access log
106.143.121.169 - - [29/Dec/2016:01:38:16 +0000] "GET /test.html HTTP/1.1" 304 179 "-" "Mozilla/5.0 (Windows NT 10.0; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/55.0.2883.87 Safari/537.36"
Output (rubydebug)
{
"message" => "106.143.121.169 - - [29/Dec/2016:01:38:16 +0000] \"GET /test.html HTTP/1.1\" 304 179 \"-\" \"Mozilla/5.0 (Windows NT 10.0; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/55.0.2883.87 Safari/537.36\"",
"@version" => "1",
"@timestamp" => "2016-12-29T01:38:16.000Z",
"path" => "/var/log/apache2/access.log",
"host" => "yoichitest01",
"type" => "apache_access",
"clientip" => "106.143.121.169",
"ident" => "-",
"auth" => "-",
"timestamp" => "29/Dec/2016:01:38:16 +0000",
"verb" => "GET",
"request" => "/test.html",
"httpversion" => "1.1",
"response" => "304",
"bytes" => "179",
"referrer" => "\"-\"",
"agent" => "\"Mozilla/5.0 (Windows NT 10.0; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/55.0.2883.87 Safari/537.36\""
}
Bug reports and pull requests are welcome on GitHub at https://github.com/yokawasa/logstash-output-azure_loganalytics.