feat: add http flusher supporting influxdb protocol & add group aggregator supporting grouping logs by keys (#521)

* feat: add influxdb flusher & group aggregator
* feat: http flusher supports concurrency setting

Showing 43 changed files with 2,954 additions and 53 deletions.

62 changes: 62 additions & 0 deletions in docs/cn/data-pipeline/aggregator/aggregator-content-value-group.md

# Group Aggregation

## Introduction

The `aggregator_content_value_group` aggregator plugin groups individual logs by the values of the specified keys.

## Configuration Parameters

| Parameter        | Type               | Required | Description |
| ---------------- | ------------------ | -------- | ----------- |
| Type             | String             | Yes      | Plugin type. Set to `aggregator_content_value_group`. |
| GroupKeys        | []String           | Yes      | List of keys whose values are used to group logs. |
| EnablePackID     | Boolean            | No       | Whether to add the `__pack_id__` field to the LogGroup's LogTag. If this parameter is not set, the `__pack_id__` field is added to the LogGroup's LogTag by default. |
| Topic            | String             | No       | Topic name of the LogGroup. If this parameter is not set, each LogGroup's topic name is empty by default. |
| ErrIfKeyNotFound | Boolean            | No       | Whether to print an error log when a specified key is not found in a log's contents. |

## Example

Collect all files whose names match `reg.log` under the `/home/test-log/` path, extract fields with `processor_regex`, aggregate the logs by the `url` and `method` fields, and send the results to SLS.

* Input

```bash
echo '127.0.0.1 - - [10/Aug/2017:14:57:51 +0800] "POST /PutData?Category=YunOsAccountOpLog" 0.024 18204 200 37 "-" "aliyun-sdk-java"' >> /home/test-log/reg.log
```

* Collection configuration

```yaml
enable: true
inputs:
  - Type: file_log
    LogPath: /home/test-log/
    FilePattern: "reg.log"
processors:
  - Type: processor_regex
    SourceKey: content
    Regex: ([\d\.]+) \S+ \S+ \[(\S+) \S+\] \"(\w+) ([^\\"]*)\" ([\d\.]+) (\d+) (\d+) (\d+|-) \"([^\\"]*)\" \"([^\\"]*)\"
    Keys:
      - ip
      - time
      - method
      - url
      - request_time
      - request_length
      - status
      - length
      - ref_url
      - browser
aggregators:
  - Type: aggregator_content_value_group
    GroupKeys:
      - url
      - method
flushers:
  - Type: flusher_sls
    Endpoint: cn-xxx.log.aliyuncs.com
    ProjectName: test_project
    LogstoreName: test_logstore
```
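
For reference, a sketch of the same aggregator block with the optional parameters from the table filled in; the `Topic` value is a hypothetical name, not part of the original example:

```yaml
aggregators:
  - Type: aggregator_content_value_group
    GroupKeys:
      - url
      - method
    # Keep the default behavior of tagging each LogGroup with __pack_id__.
    EnablePackID: true
    # Hypothetical topic name attached to every emitted LogGroup.
    Topic: nginx-access
    # Print an error log when a log has no url or method key in its contents.
    ErrIfKeyNotFound: true
```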

# HTTP

## Introduction

The `flusher_http` flusher plugin sends the collected data, after processing, to a specified address over HTTP.

## Configuration Parameters

| Parameter                    | Type               | Required | Description |
| ---------------------------- | ------------------ | -------- | ----------- |
| Type                         | String             | Yes      | Plugin type. Fixed as `flusher_http`. |
| RemoteURL                    | String             | Yes      | URL to send to, e.g. `http://localhost:8086/write`. |
| Headers                      | Map<String,String> | No       | Extra HTTP request headers to send, e.g. Authorization or Content-Type. |
| Query                        | Map<String,String> | No       | Extra query parameters appended to the URL. Supports dynamic variables, e.g. `{"db":"%{tag.db}"}`. |
| Timeout                      | String             | No       | Request timeout. Defaults to `60s`. |
| Retry.Enable                 | Boolean            | No       | Whether to retry on failure. Defaults to `true`. |
| Retry.MaxRetryTimes          | Int                | No       | Maximum number of retries. Defaults to `3`. |
| Retry.InitialDelay           | String             | No       | Delay before the first retry. Defaults to `1s`; the delay doubles on each subsequent retry. |
| Retry.MaxDelay               | String             | No       | Maximum retry delay. Defaults to `30s`. |
| Convert                      | Struct             | No       | ilogtail data conversion protocol configuration. |
| Convert.Protocol             | String             | No       | ilogtail data conversion protocol. Options: `custom_single`, `influxdb`. Defaults to `custom_single`. |
| Convert.Encoding             | String             | No       | ilogtail flusher data conversion encoding. Options: `json`, `custom`. Defaults to `json`. |
| Convert.TagFieldsRename      | Map<String,String> | No       | Renames JSON fields among the log's tags. |
| Convert.ProtocolFieldsRename | Map<String,String> | No       | Renames ilogtail log protocol fields. Currently renamable fields: `contents`, `tags`, and `time`. |
| Concurrency                  | Int                | No       | Number of concurrent requests to the URL. Defaults to `1`. |
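
As an illustration of how these parameters combine, here is a sketch of a flusher block that writes in the `influxdb` protocol with retries and concurrency; the endpoint, the upstream `db` tag, and the `custom` encoding choice are assumptions for illustration, not values mandated by the table:

```yaml
flushers:
  - Type: flusher_http
    # Assumed InfluxDB 1.x write endpoint.
    RemoteURL: "http://localhost:8086/write"
    # Resolve the target database from each log group's db tag (assumed to be set upstream).
    Query:
      db: "%{tag.db}"
    Timeout: 30s
    Retry:
      Enable: true
      MaxRetryTimes: 5
      InitialDelay: 1s
      MaxDelay: 30s
    Convert:
      # influxdb is one of the two protocols listed above; the encoding is an assumption here.
      Protocol: influxdb
      Encoding: custom
    # Send up to 4 requests in parallel.
    Concurrency: 4
```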

## Example

Collect all files whose names match `*.log` under the `/home/test-log/` path, and submit the results to `http://localhost:8086/write` using the `custom_single` protocol with `json` encoding.

```yaml
enable: true
inputs:
  - Type: file_log
    LogPath: /home/test-log/
    FilePattern: "*.log"
flushers:
  - Type: flusher_http
    RemoteURL: "http://localhost:8086/write"
    Convert:
      Protocol: custom_single
      Encoding: json
```
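
Since this commit introduces both plugins, they can also be chained. Below is a sketch, not taken from the docs above, that reuses the `processor_regex` extraction from the aggregator example so that `url` and `method` exist, groups logs by those keys, and ships the result to an assumed InfluxDB endpoint via the new `influxdb` protocol:

```yaml
enable: true
inputs:
  - Type: file_log
    LogPath: /home/test-log/
    FilePattern: "reg.log"
processors:
  # Same field extraction as in the aggregator example, so url and method
  # are present in each log's contents before grouping.
  - Type: processor_regex
    SourceKey: content
    Regex: ([\d\.]+) \S+ \S+ \[(\S+) \S+\] \"(\w+) ([^\\"]*)\" ([\d\.]+) (\d+) (\d+) (\d+|-) \"([^\\"]*)\" \"([^\\"]*)\"
    Keys:
      - ip
      - time
      - method
      - url
      - request_time
      - request_length
      - status
      - length
      - ref_url
      - browser
aggregators:
  # One LogGroup per distinct (url, method) pair.
  - Type: aggregator_content_value_group
    GroupKeys:
      - url
      - method
flushers:
  - Type: flusher_http
    # Assumed local InfluxDB write endpoint.
    RemoteURL: "http://localhost:8086/write"
    Convert:
      Protocol: influxdb
```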