Run a small ELK stack locally within a Docker environment.

Largely based on docker-elk, but without the X-Pack configuration. As such, this is:

**FOR LOCAL USE ONLY**
An ELK stack provides a simple way to analyse application logs. We do this in PROD, where we typically use the kinesis logback appender to ship logs to ELK.

Locally, we typically write logs to disk. Whilst we can tail these files to read them, it isn't very convenient. This becomes more obvious when we start to log with markers. Markers add structure to logs and allow us to group related log lines together. For example, in a REST API, if we log the HTTP method as a marker we can easily search for DELETE requests.
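For illustration, a minimal sketch of logging with a marker via the `Markers` helper from logstash-logback-encoder (the `method` field name and `MarkerExample` object are our own choices, not part of local-elk):

```scala
import net.logstash.logback.marker.Markers
import org.slf4j.LoggerFactory

object MarkerExample {
  private val logger = LoggerFactory.getLogger(getClass)

  def logDelete(resourceId: String): Unit =
    // "method" becomes a structured field on the log event,
    // so in Kibana we can filter with method: DELETE.
    logger.info(Markers.append("method", "DELETE"), s"Deleted resource $resourceId")
}
```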
- Run `./script/setup`
- Run `./script/start`
- Open https://logs.local.dev-gutools.co.uk in the browser to access Kibana
- Ship logs from your application
local-elk provides two ways to ingest logs:

- on TCP port 5000
- on a kinesis stream (running in localstack) called `local-elk-logging-kinesis-stream`
local-elk accepts TCP input on port 5000 in the JSON codec format. We can use the `LogstashTcpSocketAppender` to ship logs to it from our application.

TIP: You'll likely want a guard so that the `LogstashTcpSocketAppender` is only used in DEV, and only when you know local-elk is running. A sketch of such a guard follows.
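A minimal sketch of one possible guard, assuming a `stage` value is available from your configuration and using the `LogConfig` helper defined later in this document; the `isRunning` probe is hypothetical and simply checks whether anything is listening on port 5000:

```scala
import java.net.Socket
import scala.util.Try

object LocalElk {
  // Returns true if something is accepting connections on local-elk's TCP input.
  private def isRunning: Boolean =
    Try {
      val socket = new Socket("localhost", 5000)
      socket.close()
    }.isSuccess

  def shipLogsIfAvailable(stage: String): Unit =
    if (stage == "DEV" && isRunning) LogConfig.initLocalLogShipping
}
```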
Examples assume you are using the Play! framework in Scala.
Add an appender to your logback configuration file:
```xml
<?xml version="1.0" encoding="UTF-8"?>
<configuration>
  <appender name="LOGSTASH" class="net.logstash.logback.appender.LogstashTcpSocketAppender">
    <destination>127.0.0.1:5000</destination>
    <encoder class="net.logstash.logback.encoder.LogstashEncoder"/>
  </appender>

  <root level="INFO">
    <appender-ref ref="LOGSTASH"/>
  </root>
</configuration>
```
More information is available in the logstash-logback-encoder documentation.
While shipping logs via the logback configuration is fine, we may want to augment logs with more information before writing them. For example, in PROD we'd add the `Stage`, `Stack` and `App` tags.

Within Play!, logging is usually set up in the `ApplicationLoader`. To ship logs to local-elk we need to add a `LogstashTcpSocketAppender` to the `Logger` (the `AppComponents` class below stands in for your own components wiring):
```scala
// In AppLoader.scala
import play.api.ApplicationLoader.Context
import play.api.{Application, ApplicationLoader}

class AppLoader extends ApplicationLoader {
  final override def load(context: Context): Application = {
    LogConfig.initLocalLogShipping
    // Wire up the application as usual; AppComponents is a stand-in for
    // your own BuiltInComponentsFromContext implementation.
    new AppComponents(context).application
  }
}
```
```scala
// In LogConfig.scala
import ch.qos.logback.classic.{LoggerContext, Logger => LogbackLogger}
import net.logstash.logback.appender.LogstashTcpSocketAppender
import net.logstash.logback.encoder.LogstashEncoder
import org.slf4j.{LoggerFactory, Logger => SLFLogger}
import java.net.InetSocketAddress
import play.api.libs.json._
import scala.util.Try

object LogConfig {
  private val rootLogger: LogbackLogger =
    LoggerFactory.getLogger(SLFLogger.ROOT_LOGGER_NAME).asInstanceOf[LogbackLogger]

  private val BUFFER_SIZE = 1000

  // Mirror the Stage, Stack and App tags we'd have in PROD.
  private def createCustomFields(): String = {
    Json.toJson(Map(
      "stack" -> "local-elk",
      "stage" -> "DEV",
      "app" -> "demo"
    )).toString()
  }

  private def createLogstashAppender(context: LoggerContext): LogstashTcpSocketAppender = {
    val customFields = createCustomFields()

    val appender = new LogstashTcpSocketAppender()
    appender.setContext(context)
    appender.addDestinations(new InetSocketAddress("localhost", 5000))
    appender.setWriteBufferSize(BUFFER_SIZE)

    val encoder = new LogstashEncoder()
    encoder.setCustomFields(customFields)
    encoder.start()

    appender.setEncoder(encoder)
    appender.start()

    appender
  }

  def initLocalLogShipping: Unit = {
    Try {
      rootLogger.info("Configuring local logstash log shipping")

      val appender = createLogstashAppender(rootLogger.getLoggerContext)
      rootLogger.addAppender(appender)

      rootLogger.info("Local logstash log shipping configured")
    } recover {
      case e => rootLogger.error("LogConfig Failed!", e)
    }
  }
}
```
You can configure your application to write to the kinesis stream `local-elk-logging-kinesis-stream`. As this is running in localstack, you'll also need to set a custom endpoint of http://localhost:4566:
```scala
import com.amazonaws.client.builder.AwsClientBuilder.EndpointConfiguration
import com.amazonaws.services.kinesis.AmazonKinesisClientBuilder

val endpoint = new EndpointConfiguration("http://localhost:4566", "eu-west-1")
val client = AmazonKinesisClientBuilder.standard().withEndpointConfiguration(endpoint).build()
```
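To sanity-check the wiring, you can put a record on the stream directly with the `client` built above; this sketch assumes the AWS SDK v1, and the JSON payload and partition key are arbitrary examples:

```scala
import java.nio.ByteBuffer

// Logstash's JSON codec will parse this payload into fields in Kibana.
client.putRecord(
  "local-elk-logging-kinesis-stream",
  ByteBuffer.wrap("""{"message": "hello from my app"}""".getBytes("UTF-8")),
  "a-partition-key"
)
```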
We could use the File input plugin to ship logs from disk. However, this has a couple of problems:

- The files need to be mounted into the Logstash container of local-elk. This doesn't scale well and isn't very generic.
- The log files on disk would need to be written with the `net.logstash.logback.encoder.LogstashEncoder` encoder to produce JSON data. This makes them less human-readable, in the event that reading these files becomes necessary.