log4stash

NOTE: This package is no longer maintained. I would be happy to get help with maintaining it.


log4stash is a log4net appender that logs messages to the ElasticSearch document database. ElasticSearch offers a robust full-text search engine and analysis capabilities, so errors and messages can be indexed quickly and searched easily.

log4stash provides a few logging filters, similar to the filters in logstash.

The origin of log4stash is @jptoto's log4net.ElasticSearch repository.

log4stash is based on RestSharp and Newtonsoft.Json but uses ILRepack to merge them in and avoid NuGet dependencies.

log4stash is fully open source, MIT licensed.

Features:

  • Supports .NET 4.5.2+ and .NET Core 2.0+
  • Easy installation and setup via NuGet
  • Ability to analyze the log event before sending it to ElasticSearch, using built-in filters and custom filters similar to logstash's.

Breaking Changes:

Navigate to the breaking changes page here. See also the Version notes page.

Filters:

  • Add - adds a new key and value to the event.
  • Remove - removes a key from the event.
  • Rename - renames a key.
  • Kv - analyzes a value (default is the 'Message' value) and extracts key-value pairs using regex, similar to logstash's kv filter (see the example right after this list).
  • Grok - analyzes a value (default is 'Message') using custom regex and saved patterns, similar to logstash's grok filter.
  • ConvertToArray - splits a raw string into an array by the given separators.
  • Json - converts a JSON string to an object (so it will be parsed as an object in ElasticSearch).
  • Convert - available converters: ToString, ToLower, ToUpper, ToInt and ToArray. See the config example for more information.
  • Xml - parses XML into an object.
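
For example, assuming the Kv settings used in the full configuration below (ValueSplit ':=', FieldSplit ' ,'), and assuming ValueSplit/FieldSplit are sets of separator characters as in logstash's kv filter, a hypothetical event would be enriched like this:

```
Message: "user=bob action=login"
   →  adds the fields: user = "bob", action = "login"
```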

Custom filter:

To add your own filter, implement the IElasticAppenderFilter interface in your assembly and configure it in the log4net configuration file.
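
A minimal sketch of such a filter in C#. The member names below (PrepareConfiguration, PrepareEvent) are assumptions modeled on the built-in filters; check the IElasticAppenderFilter interface in the source for the exact signature:

```csharp
using System;
using System.Collections.Generic;
using log4stash; // the interface may live in a sub-namespace; check the source

// Hypothetical filter that stamps every event with the machine name.
public class MachineNameFilter : IElasticAppenderFilter
{
    // Called once when the appender is configured.
    public void PrepareConfiguration(IElasticsearchClient client)
    {
        // Nothing to prepare for this filter.
    }

    // Called for every log event before it is sent to ElasticSearch.
    public void PrepareEvent(Dictionary<string, object> logEvent)
    {
        logEvent["MachineName"] = Environment.MachineName;
    }
}
```

It is then loaded like any other custom filter, e.g. `<Filter type="MyAssembly.MachineNameFilter, MyAssembly" />` inside ElasticFilters (see the full configuration below).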

Issues:

I do my best to reply to issues or questions ASAP. Please use the ISSUES page to submit questions or errors.

Configuration Examples:

Almost all of the parameters are optional. To see the default values, check the constructor of the appender and the constructor of each filter. You can also set any public property on the appender/filter that does not appear in the examples.

Simple configuration:

```xml
<appender name="ElasticSearchAppender" type="log4stash.ElasticSearchAppender, log4stash">
    <Server>localhost</Server>
    <Port>9200</Port>
    <ElasticFilters>
      <!-- example of using a filter with default parameters -->
      <Kv />
    </ElasticFilters>
</appender>
```
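
With the appender configured, logging goes through plain log4net. A minimal usage sketch; the file name log4net.config is an assumption about your setup:

```csharp
using System.IO;
using System.Reflection;
using log4net;
using log4net.Config;

public static class Program
{
    private static readonly ILog Log = LogManager.GetLogger(typeof(Program));

    public static void Main()
    {
        // Load the log4net configuration that contains the ElasticSearchAppender.
        var repository = LogManager.GetRepository(Assembly.GetEntryAssembly());
        XmlConfigurator.Configure(repository, new FileInfo("log4net.config"));

        // This event is buffered and indexed into ElasticSearch by the appender.
        Log.Info("Hello from log4stash");
    }
}
```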
(Almost) Full configuration:

```xml
<appender name="ElasticSearchAppender" type="log4stash.ElasticSearchAppender, log4stash">
	<Server>localhost</Server>
	<Port>9200</Port>
	<!-- optional: in case ElasticSearch is located behind a reverse proxy the URL looks like http://Server:Port/Path, default = empty string -->
	<Path>/es5</Path>
	<!-- The time zone for the date formatter is based on the character before the format: '+' = local time, '~' = UTC -->
	<IndexName>log_test_%{+yyyy-MM-dd}</IndexName>
	<!-- type support was removed in ElasticSearch 7, so if it is not defined in the configuration no type is sent in the request -->
	<IndexType>LogEvent</IndexType>
	<BulkSize>2000</BulkSize>
	<BulkIdleTimeout>10000</BulkIdleTimeout>
	<IndexAsync>False</IndexAsync>
	<DropEventsOverBulkLimit>False</DropEventsOverBulkLimit>

	<!-- Serialize the log object as json (default is true).
      -- This applies when you log an object: `logger.Debug(obj);` rather than a string: `logger.Debug("string");` -->
	<SerializeObjects>True</SerializeObjects>

	<!-- optional: elasticsearch timeout for the request, default = 10000 -->
	<ElasticSearchTimeout>10000</ElasticSearchTimeout>

	<!-- optional: ssl connection -->
	<Ssl>False</Ssl>
	<AllowSelfSignedServerCert>False</AllowSelfSignedServerCert>

	<!-- You can add parameters to the request sent to ElasticSearch.
    For example, as shown here, you can add a custom id source to the appender.
    The Key is the key added to the request, and the Value is the parameter's name in the log event properties. -->
	<IndexOperationParams>
		<Parameter>
			<Key>_id</Key>
			<Value>%{IdSource}</Value>
		</Parameter>
		<Parameter>
			<Key>key</Key>
			<Value>value</Value>
		</Parameter>
	</IndexOperationParams>

	<!-- for more information read about log4net.Core.FixFlags -->
	<FixedFields>Partial</FixedFields>

	<Template>
		<Name>templateName</Name>
		<FileName>path2template.json</FileName>
	</Template>

	<!-- Only one credential type can be used at a time -->
	<!-- All possible types are listed here -->
	<AuthenticationMethod>
		<!--For basic authentication purposes-->
		<Basic>
			<Username>Username</Username>
			<Password>Password</Password>
		</Basic>
		<!--For AWS ElasticSearch service-->
		<Aws>
			<Aws4SignerSecretKey>Secret</Aws4SignerSecretKey>
			<Aws4SignerAccessKey>AccessKey</Aws4SignerAccessKey>
			<Aws4SignerRegion>Region</Aws4SignerRegion>
		</Aws>
		<!-- For API Key (X-Pack) authentication -->
		<ApiKey>
			<!-- ApiKeyBase64 takes precedence over Id/ApiKey -->
			<ApiKeyBase64>aWQ6YXBpa2V5</ApiKeyBase64>
			<!-- Or -->
			<Id>id</Id>
			<ApiKey>apikey</ApiKey>
		</ApiKey>
	</AuthenticationMethod>

	<!-- all filters go in the ElasticFilters tag -->
	<ElasticFilters>
		<Add>
			<Key>@type</Key>
			<Value>Special</Value>
		</Add>

		<!-- using the @type value from the previous filter -->
		<Add>
			<Key>SmartValue</Key>
			<Value>the type is %{@type}</Value>
		</Add>

		<Remove>
			<Key>@type</Key>
		</Remove>

		<!-- you can load custom filters like I do here -->
		<Filter type="log4stash.Filters.RenameKeyFilter, log4stash">
			<Key>SmartValue</Key>
			<RenameTo>SmartValue2</RenameTo>
		</Filter>

		<!-- converts a json object to fields in the document -->
		<Json>
			<SourceKey>JsonRaw</SourceKey>
			<FlattenJson>false</FlattenJson>
			<!-- the separator property is only relevant when setting the FlattenJson property to 'true' -->
			<Separator>_</Separator> 
		</Json>

		<!-- converts an xml object to fields in the document -->
		<Xml>
			<SourceKey>XmlRaw</SourceKey>
			<FlattenXml>false</FlattenXml>
		</Xml>

		<!-- kv and grok filters similar to logstash's filters -->
		<Kv>
			<SourceKey>Message</SourceKey>
			<ValueSplit>:=</ValueSplit>
			<FieldSplit> ,</FieldSplit>
		</Kv>

		<Grok>
			<SourceKey>Message</SourceKey>
			<Pattern>the message is %{WORD:Message} and guid %{UUID:the_guid}</Pattern>
			<Overwrite>true</Overwrite>
		</Grok>

		<!-- Converts a string like "1,2, 45 9" into an array of numbers [1,2,45,9] -->
		<ConvertToArray>
			<SourceKey>someIds</SourceKey>
			<!-- The separators (space and comma) -->
			<Seperators>, </Seperators> 
		</ConvertToArray>

		<Convert>
			<!-- convert given key to string -->
			<ToString>shouldBeString</ToString>

			<!-- same as ConvertToArray. Just for convenience -->
			<ToArray>
				<SourceKey>anotherIds</SourceKey>
			</ToArray>
		</Convert>
	</ElasticFilters>
</appender>
```
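
The %{IdSource} value in IndexOperationParams above is read from the log event properties. A sketch of supplying it from code; ThreadContext is standard log4net, while the property name IdSource simply matches the example config:

```csharp
using log4net;

public class OrderService
{
    private static readonly ILog Log = LogManager.GetLogger(typeof(OrderService));

    public void Process(string orderId)
    {
        // Sets the log event property that %{IdSource} resolves, so this
        // event's ElasticSearch document gets the order id as its _id.
        ThreadContext.Properties["IdSource"] = orderId;

        // With SerializeObjects=True, the anonymous object is serialized
        // to JSON instead of being rendered with ToString().
        Log.Info(new { Order = orderId, Status = "processed" });
    }
}
```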

Note that the filters are called in the order they appear in the config (as shown in the example).

Templates:

To get to know ElasticSearch templates, follow the link.

A sample template can be found in log-index-spec.json, and a more complex template with dynamic mappings can be found in the tests template: template.json.

You can follow the link to read more about dynamic mappings.
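
For orientation, a minimal template sketch in the ElasticSearch 7 format; the field names and settings are illustrative and not taken from the repository's template files (ES 5/6 use a slightly different format, e.g. "template" instead of "index_patterns" and a type level under "mappings"):

```json
{
  "index_patterns": ["log_test_*"],
  "settings": {
    "number_of_shards": 1
  },
  "mappings": {
    "properties": {
      "@timestamp": { "type": "date" },
      "Message":    { "type": "text" },
      "LoggerName": { "type": "keyword" }
    }
  }
}
```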

License:

MIT License

Thanks:

Thanks to @eran-gil for helping me update this package to support newer versions of ES, set up continuous deployment, and fix issues.

Thanks to @jptoto for the idea and the first working ElasticAppender. Many thanks to @mpdreamz and the team for their great work on the NEST library! The inspiration for the filters and style was taken from the elasticsearch/logstash project.

Build status:

The CI runs on Azure DevOps and is tested against ElasticSearch 5, 6 and 7 on every build. Support for older ElasticSearch versions is no longer maintained.