You can either run this on bare metal or use the provided Docker Compose config.
Install the connector:
confluent-hub install --no-prompt confluentinc/kafka-connect-syslog:1.0.0-preview
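Kafka Connect only picks up newly installed plugins on startup, so restart it after the install. To confirm the plugin is visible, you can query the Connect REST API (assuming it is listening on the default localhost:8083):
curl -s http://localhost:8083/connector-plugins | jq '.[].class'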
Run Kafka Connect. Wait for it to be available, and then create the connector:
curl -i -X POST -H "Accept:application/json" \
-H "Content-Type:application/json" \
http://localhost:8083/connectors/ \
-d @syslog_udp_config.json
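Once the connector is created you can check that it and its task are RUNNING, using the connector name syslog-udp from the config below:
curl -s http://localhost:8083/connectors/syslog-udp/status | jq '.'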
Where the config looks like:
{
  "name": "syslog-udp",
  "config": {
    "tasks.max": "1",
    "connector.class": "io.confluent.connect.syslog.SyslogSourceConnector",
    "topic.prefix": "syslog",
    "syslog.port": "42514",
    "syslog.listener": "UDP",
    "syslog.reverse.dns.remote.ip": "true"
  }
}
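With topic.prefix set to syslog, the data lands in a topic called syslog. Once messages start arriving you can confirm the topic exists (the tool may be named kafka-topics.sh depending on your distribution):
kafka-topics --bootstrap-server localhost:9092 --list | grep syslog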
Make sure you’re in the same folder as docker-compose.yml, and then run
docker-compose up -d
This will bring up the necessary Confluent Platform stack, and once Kafka Connect has started, create the connector using the config in syslog_udp_config.json.
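If you want to watch it happen, tail the Connect container's logs and then list the connectors once the REST API is up (the service name kafka-connect is an assumption; check docker-compose.yml for the actual name):
docker-compose logs -f kafka-connect   # service name is an assumption
curl -s http://localhost:8083/connectors | jq '.'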
Send some test syslog data to the connector:
logger -n 127.0.0.1 -P 42514 I ❤️ logs
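If your logger doesn't support the -n/-P flags (for example, the BSD version shipped with macOS), a rough equivalent is to hand-craft an RFC 3164-style line and send it with netcat. The priority <14> (facility user, severity info) is just an illustrative value:
echo "<14>$(date '+%b %d %H:%M:%S') $(hostname): I ❤️ logs" | nc -u -w1 127.0.0.1 42514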
See the data in the Kafka topic:
$ kafka-avro-console-consumer \
--bootstrap-server localhost:9092 \
--property schema.registry.url=http://localhost:8081 \
--topic syslog --from-beginning | jq '.'
{
  "name": null,
  "type": "RFC3164",
  "message": {
    "string": "I ❤️ logs\u0000"
  },
  "host": {
    "string": "rmoff:"
  },
  "version": null,
  "level": {
    "int": 5
  },
  "tag": null,
  "extension": null,
  "severity": null,
  "appName": null,
  "facility": {
    "int": 0
  },
  "remoteAddress": {
    "string": "192.168.240.1"
  },
  "rawMessage": {
    "string": "<5>Nov 9 13:59:50 rmoff: I ❤️ logs\u0000"
  },
  "processId": null,
  "messageId": null,
  "structuredData": null,
  "deviceVendor": null,
  "deviceProduct": null,
  "deviceVersion": null,
  "deviceEventClassId": null,
  "date": 1541771990000
}
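If you only care about a couple of fields, jq can project them out of the decoded record; the paths below follow the Avro union encoding ({"string": ...}) shown above:
kafka-avro-console-consumer \
    --bootstrap-server localhost:9092 \
    --property schema.registry.url=http://localhost:8081 \
    --topic syslog --from-beginning | \
  jq '{host: .host.string, message: .message.string}'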