- If you don't have a Raspberry Pi, you can skip to Docker Setup and use the sample data
- Clone this repo on the Raspberry Pi
- Install dump1090
- Set the IP address below to the Docker host (not the IP address of the Raspberry Pi itself). This is where the messages will be sent to Kafka
```bash
# On the Raspberry Pi
cd raspberry-pi
export HOST_IP=192.168.1.129 # Docker host
./plane-kafka.py
```
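For context, the script's job is to read the decoded messages from dump1090 and publish them to Kafka on the Docker host. Below is a minimal sketch of that idea, assuming dump1090 is serving BaseStation (SBS) messages on its default TCP port 30003 and that the `kafka-python` package is installed; the topic name and the lack of parsing are illustrative, not the repo's exact implementation.

```python
# Illustrative sketch only -- not the repo's plane-kafka.py.
# Assumes dump1090 serves SBS/BaseStation messages on TCP port 30003
# and that the kafka-python package is installed.
import os
import socket

from kafka import KafkaProducer

host_ip = os.environ["HOST_IP"]                       # the Docker host running Kafka
producer = KafkaProducer(bootstrap_servers="%s:9092" % host_ip)

sbs = socket.create_connection(("localhost", 30003))  # dump1090 SBS output
buffer = b""
while True:
    buffer += sbs.recv(4096)
    while b"\n" in buffer:
        line, buffer = buffer.split(b"\n", 1)
        if line.strip():
            # Topic name is hypothetical; the real one is set by the repo's scripts
            producer.send("aircraft-raw", line.strip())
```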
On your host (probably your laptop or PC), clone this repo.

```bash
# Start the containers
docker-compose up -d
```
You will need a database of ICAO-to-aircraft mappings (in `icao-to-aircraft.json`) and a database of callsigns (`callsign-details.json`). A good source of data is https://openflights.org/data.html, where you can find aircraft data suitable for your region.

If you are in a hurry, `icao-to-aircraft.json.sample` and `callsign-details.json.sample` provide basic records to experiment with.
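The exact schema is whatever the sample files contain; the records below use purely hypothetical field names, shown only to illustrate the kind of lookup data each file holds.

```
# Hypothetical record shapes -- check the .sample files for the real field names
{"icao": "7C6B2D", "manufacturer": "Boeing", "aircraft": "737-800"}   # icao-to-aircraft.json
{"callsign": "QFA2", "operator": "Qantas", "route": "LHR-SYD"}        # callsign-details.json
```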
```bash
docker-compose exec confluent bash
```

And within the container:

```bash
confluent start
cd /scripts
./01_setup_topics
```
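`01_setup_topics` is the source of truth for the topic names; if you want to create or check a topic by hand, the standard tooling looks roughly like the sketch below (the partition and replication settings are assumptions).

```bash
# Illustrative only -- the real topic names and settings live in ./01_setup_topics
kafka-topics --zookeeper localhost:2181 --create \
  --topic icao-to-aircraft --partitions 1 --replication-factor 1
kafka-topics --zookeeper localhost:2181 --list
```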
If you do not have database files, copy the sample files:

```bash
cp -i icao-to-aircraft.json.sample icao-to-aircraft.json
cp -i callsign-details.json.sample callsign-details.json
```
Now load the files. This will load data into the `icao-to-aircraft` and `callsign-details` topics.

```bash
./02_do_load
```
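If the load script gives you trouble, roughly the same result (modulo any keying the script applies) can be had by piping each file through the console producer; the broker address below is an assumption about the compose setup.

```bash
# Illustrative alternative to ./02_do_load -- broker address is an assumption
kafka-console-producer --broker-list localhost:9092 \
  --topic icao-to-aircraft < icao-to-aircraft.json
kafka-console-producer --broker-list localhost:9092 \
  --topic callsign-details < callsign-details.json
```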
Then run KSQL. If you receive parse errors, try running the KSQL statements one by one instead of pasting the entire script at once.

```
ksql
-- paste the commands from file 03_ksql.sql
exit
```
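Before or after pasting the script, a couple of quick sanity checks at the `ksql>` prompt can confirm that the lookup data actually landed in the topics.

```sql
-- Optional sanity checks at the ksql> prompt
SHOW TOPICS;
PRINT 'icao-to-aircraft' FROM BEGINNING;
```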
Still within the container, finish the remaining setup steps:

```bash
./04_elastic_dynamic_template
./05_set_connect
```
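These scripts presumably set up the Elasticsearch index template (so `EVENT_TS` is mapped as a date) and register the Elasticsearch sink connector with Kafka Connect. If you want to verify the connector by hand, the Connect REST API is handy; the port below is the default 8083, and the connector name is whatever `05_set_connect` registered.

```bash
# Assumes Kafka Connect's REST API is on the default port 8083
curl -s http://localhost:8083/connectors
curl -s http://localhost:8083/connectors/<connector-name>/status
```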
- Navigate to http://localhost:5601
- Create index patterns for `locationtable` and `callsigntable`. Each should have the `EVENT_TS` field marked as the timestamp
- Use the Kibana management page to import the saved objects in `06_kibana_export.json`