Hi everyone,
One idea I haven't seen in other log collectors or stream processing products is doing anomaly detection over logs/metrics in a simple way, and in this case, at the edge.
The advantage of FluentBit here is that it can even create the metrics for different kinds of logs by itself, using Stream Processing or Log to Metrics, if the original technology doesn't provide them.
For example, first you could create a Stream or a Logs to Metrics filter to generate metrics from firewall logs for denied connections by IP/user/app (depending on whether you are using a common firewall or an NG firewall), aggregate them over a minute, and then apply an algorithm like Random Cut Forest (RCF), https://docs.aws.amazon.com/sagemaker/latest/dg/randomcutforest.html, which is an unsupervised algorithm for detecting anomalous data points within a data set.
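As an untested sketch of what that first step could look like with FluentBit's existing Stream Processing engine (the `firewall.*` tag and the `src_ip`/`action` field names are just assumptions about the log format, not real defaults):

```
# stream_processor.conf, referenced from the main config via [SERVICE] streams_file.
# Counts denied connections per source IP in 60-second tumbling windows and
# re-emits the aggregated records under the tag 'metrics.fw.denied'.
[STREAM_TASK]
    Name  denied_per_minute
    Exec  CREATE STREAM denied_per_minute WITH (tag='metrics.fw.denied') AS SELECT src_ip, COUNT(*) AS denied_count FROM TAG:'firewall.*' WHERE action = 'deny' WINDOW TUMBLING (60 SECOND) GROUP BY src_ip;
```

The per-minute denied_count series produced this way would then be the input to the RCF step.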
One example showing that this can be implemented at the edge or as a stream processing option is the Data Prepper Anomaly Detector processor from OpenSearch, https://opensearch.org/docs/2.11/data-prepper/pipelines/configuration/processors/anomaly-detector/.
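For reference, a Data Prepper pipeline using that processor looks roughly like this (the source, the sink and the `denied_count` key are placeholders for the firewall example above, not a tested configuration):

```yaml
fw-anomaly-pipeline:
  source:
    http:                       # placeholder source receiving the aggregated records
  processor:
    - anomaly_detector:
        keys: ["denied_count"]  # numeric field(s) to score
        mode:
          random_cut_forest:    # built-in Random Cut Forest mode
  sink:
    - opensearch:
        hosts: ["https://opensearch:9200"]
        index: firewall-anomalies
```

A similar filter/processor inside FluentBit, keeping an RCF model per label set and emitting an anomaly score, could cover the same use case directly at the edge.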
For me, this could be a big differentiator for FluentBit over other log collectors or stream processing engines.