Inspired by the devopstales pfSense parser.
This guide walks through taking the logs generated by Snort's Barnyard2 (within pfSense), parsing them in Graylog, and piping the resulting information into Grafana.
- pfSense with Snort running
- Graylog (Version 3.2.0+)
- Grafana (Optional, but recommended, see Grafana section for requirements)
If you don't have those three running, you'll need to get them set up in your environment before continuing.
Note: If you're running a separate Elasticsearch backend other than the one Graylog uses, you cannot use the OSS version with this setup.
I call this the "pre-configuration" because it's what we need to do before we get into the real meat and potatoes. Follow the steps below to get Graylog ready to parse Snort logs from pfSense.
- Create a new index set with the settings below
- Download the `snort_barnyard2_graylog_content_pack.json` file from this repository, go to System -> Content Packs, click "Upload" in the top right, and upload the JSON file. This content pack creates the inputs, streams, pipelines, pipeline rules, lookup tables, lookup caches, and lookup data adapters needed to properly parse the logs.
- Go to Streams and edit the Snort Barnyard2 Logs stream: check "Remove matches from 'All messages' stream" and set the index set to the one you just created. Removing matches from the 'All messages' stream keeps us from storing messages twice, and setting the index set makes everything work with our pipelines.
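As an optional sanity check, you can confirm that the input created by the content pack is actually listening. A minimal sketch, assuming the input is UDP on port 10001 (the port used later in this guide) and a Linux host with `ss` available:

```sh
# Confirm the Barnyard2 syslog input is listening on 10001/udp.
sudo ss -lunp | grep 10001
```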
Awesome, Graylog is now "pre-configured" for what we need to do. Let's move on to the next section.
Now that you have the content pack installed, to fully utilize it and get IP geolocation you'll need to download the MaxMind GeoLite2 database (MMDB format) and place the file on your Graylog server.
- Go to MaxMind and click Sign Up For GeoLite2 at the bottom.
- Create an account with MaxMind and sign in
- Once you're signed in, click "Download Databases"
- Click Download GZIP next to GeoLite2 City (DO NOT DOWNLOAD THE CSV FORMAT)
- Extract the archive and place the `GeoLite2-City.mmdb` file in /etc/graylog/server/ on your Graylog server. Note: you may need to chmod `GeoLite2-City.mmdb` to 744 and chown the file to whoever owns the rest of the files in that directory for your install (see the sketch after this list).
- In Graylog, go to System -> Configurations and click Update under Geo-Location Processor.
- Set /etc/graylog/server/GeoLite2-City.mmdb as the path, choose City Database as the type, and click Save.
- Scroll to the top, click Update under Message Processors Configuration, and change the order to what is shown below.
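A minimal sketch of the extract-and-install step, assuming the GZIP download is in your current directory and that a `graylog` user owns /etc/graylog/server/ (substitute the owner your install actually uses):

```sh
# Unpack the GeoLite2 City download and install the database for Graylog.
tar -xzf GeoLite2-City_*.tar.gz
sudo cp GeoLite2-City_*/GeoLite2-City.mmdb /etc/graylog/server/
# Make the file readable by the Graylog service; "graylog" is an assumed owner.
sudo chmod 744 /etc/graylog/server/GeoLite2-City.mmdb
sudo chown graylog:graylog /etc/graylog/server/GeoLite2-City.mmdb
```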
Under the hood, Graylog runs Elasticsearch, which is what stores our logs in "indexes". We need to use a tool called Cerebro to modify our Barnyard2 Logs index so that the coordinates are mapped properly.
You'll need to download Cerebro and be able to run it from a Linux box; I personally run it from my Graylog server when needed. Go to the Cerebro link above and `git clone` the repository to your home directory (one way to do this is sketched below).
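For example (the GitHub URL is the upstream Cerebro repository; since the `cerebro-*/bin` launcher used below ships with the packaged releases, fetching a release archive is shown as well, with v0.9.4 as an illustrative version):

```sh
cd ~
# Clone the Cerebro repository...
git clone https://github.com/lmenezes/cerebro.git
# ...or grab a packaged release, which extracts to cerebro-<version>/ with a bin/ launcher.
wget https://github.com/lmenezes/cerebro/releases/download/v0.9.4/cerebro-0.9.4.tgz
tar -xzf cerebro-0.9.4.tgz
```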
Now that we have the repository cloned, go into the `cerebro-*/bin` folder and run Cerebro. I launch it with a few custom options so that it doesn't run on its default port of 9000. Use the command below to run Cerebro.
```sh
./cerebro -Dhttp.port=9091 -Dhttp.address=X.X.X.X
```
Change X.X.X.X to the primary IP of the server you're running Cerebro on.
Once you have Cerebro running, navigate to the web interface in your browser at http://X.X.X.X:9091, then target your Graylog server's IP and Elasticsearch port (default 9200), e.g. http://10.1.1.1:9200, and click "Connect".
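If the connection fails, a quick check from the Cerebro host (using the example address from above) tells you whether Elasticsearch is reachable at all:

```sh
# A healthy node answers with a small JSON blob including the cluster name and version.
curl http://10.1.1.1:9200
```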
Now that you're in Cerebro, we need to create an index template. Go to More -> Index Templates, and on the right-hand side you can create a new template. We're going to call this template `Barnyard2-Custom`; copy and paste the contents of `elasticsearch_custom_template.json` into the Template section, then click Create at the bottom.
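If you'd rather not click through Cerebro for this step, the same thing can be done against the Elasticsearch API directly. A sketch, assuming the repository's JSON file is a complete template body and the legacy `_template` endpoint of the Elasticsearch 6.x/7.x versions that Graylog 3.x supports:

```sh
# Create the Barnyard2-Custom index template from the repository's JSON file.
curl -X PUT "http://10.1.1.1:9200/_template/Barnyard2-Custom" \
  -H 'Content-Type: application/json' \
  -d @elasticsearch_custom_template.json
```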
Now that we've created the template, stop the Graylog service by running `systemctl stop graylog-server` on your Graylog server. Once it has stopped, delete the `barnyard_0` index, which is visible under Overview in Cerebro. Once it is deleted, start graylog-server again using `systemctl start graylog-server`.
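The whole stop/delete/start cycle from a shell, using the Elasticsearch delete-index API in place of Cerebro's Overview page (same example address as before; note that deleting the index discards any messages already stored in it):

```sh
sudo systemctl stop graylog-server
# Delete the old index so it is recreated with the new template's mappings.
curl -X DELETE "http://10.1.1.1:9200/barnyard_0"
sudo systemctl start graylog-server
```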
Okay, we have Graylog completely configured. The last step is to pipe logs from Snort into Graylog. Follow the steps below to complete this.
- Log in to pfSense and go to Services -> Snort
- Edit the interface you want to get logs from (most likely your WAN interface)
- Navigate to WAN Barnyard2
- Check the top box, "Enable barnyard2 for this interface. You will also need to enable at least one logging destination below."
- Check "Enable logging of alerts to a local or remote syslog receiver." under Syslog Output Settings
- Set the remote host to your Graylog server IP and set the port to 10001 (Barnyard2 Graylog Input Port)
- Click Save at the bottom
We now need to confirm that Graylog is receiving all the Snort logs. We can do this by going to Streams -> Snort Barnyard2 Logs and making sure we're receiving messages. If you click into a message you should see fields such as `src_addr`, `src_addr_geo_location`, `dst_addr`, `dst_addr_geo_location`, etc.
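If nothing shows up in the stream, it's worth checking at the network level whether pfSense's syslog packets are reaching the Graylog server at all. A quick sketch, assuming the 10001/udp port configured above:

```sh
# Watch for Barnyard2 syslog traffic arriving from pfSense.
sudo tcpdump -n -i any udp port 10001
```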
Now that we have Snort logs within Graylog and we're receiving geolocation coordinates, we can map those coordinates onto world maps within Grafana. To complete this section you'll need:
- Grafana already running in your environment
- An understanding of how Grafana Panels work
- World Map Grafana Plugin
First things first, we need to add Graylog as a data source in Grafana. We can do this by adding a new Elasticsearch data source and configuring it like the image below. In the URL box, put http://X.X.X.X:9200 (replace X.X.X.X with the IP of your Graylog/Elasticsearch server).
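When filling in the rest of the data source form, it can help to confirm the exact index names Grafana should target and that Elasticsearch is reachable from the Grafana host, e.g.:

```sh
# List the Barnyard2 indices; these are what the data source's index pattern should match.
curl "http://X.X.X.X:9200/_cat/indices/barnyard*?v"
```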
Now that we have our data source, we can import the `snort_grafana_dashboard.json` file from this repository into Grafana. This gives you a very basic starting dashboard for Snort, showing an incoming-connection map, top city, top country, top source IP, top classification, top attack, and top destination port.
To import the dashboard:
- Go to Dashboards -> Manage
- Click Import in the top-ish right
- Click Upload .JSON and select the JSON file you downloaded from the repo
- Click Load
You have now uploaded the dashboard, but you'll need to edit each panel to target the newly created Elasticsearch data source. Once you've changed the data source for each panel you should be off to the races!
Enjoy your new parsed Snort logs and Grafana dashboard!