[Ingest Manager] New structure of agent configuration #19128

Merged Jun 17, 2020 (10 commits). Showing changes from 3 commits.
@@ -17,7 +17,7 @@ import (
)

const (
-	datasourcesKey          = "datasources"
+	inputsKey               = "inputs"
	constraintsKey          = "constraints"
	validateVersionFuncName = "validate_version"
)
@@ -30,26 +30,26 @@ var (
// ConstraintFilter filters ast based on included constraints.
func ConstraintFilter(log *logger.Logger, ast *transpiler.AST) error {
	// get datasources
-	dsNode, found := transpiler.Lookup(ast, datasourcesKey)
+	inputsNode, found := transpiler.Lookup(ast, inputsKey)
	if !found {
		return nil
	}

-	dsListNode, ok := dsNode.Value().(*transpiler.List)
+	inputsListNode, ok := inputsNode.Value().(*transpiler.List)
	if !ok {
		return nil
	}

-	dsList, ok := dsListNode.Value().([]transpiler.Node)
+	inputsList, ok := inputsListNode.Value().([]transpiler.Node)
	if !ok {
		return nil
	}

	// for each datasource
	i := 0
-	originalLen := len(dsList)
-	for i < len(dsList) {
-		constraintMatch, err := evaluateConstraints(log, dsList[i])
+	originalLen := len(inputsList)
+	for i < len(inputsList) {
+		constraintMatch, err := evaluateConstraints(log, inputsList[i])
		if err != nil {
			return err
		}
@@ -58,20 +58,20 @@ func ConstraintFilter(log *logger.Logger, ast *transpiler.AST) error {
			i++
			continue
		}
-		dsList = append(dsList[:i], dsList[i+1:]...)
+		inputsList = append(inputsList[:i], inputsList[i+1:]...)
	}

-	if len(dsList) == originalLen {
+	if len(inputsList) == originalLen {
		return nil
	}

	// Replace datasources with limited set
-	if err := transpiler.RemoveKey(datasourcesKey).Apply(ast); err != nil {
+	if err := transpiler.RemoveKey(inputsKey).Apply(ast); err != nil {
		return err
	}

-	newList := transpiler.NewList(dsList)
-	return transpiler.Insert(ast, newList, datasourcesKey)
+	newList := transpiler.NewList(inputsList)
+	return transpiler.Insert(ast, newList, inputsKey)
}

func evaluateConstraints(log *logger.Logger, datasourceNode transpiler.Node) (bool, error) {
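The loop in `ConstraintFilter` removes non-matching entries from the slice in place: the index only advances when the current element is kept, so deletions never skip the element that shifts into position `i`. A standalone sketch of the same pattern (hypothetical `node` type standing in for `transpiler.Node`; not the agent's actual API):

```go
package main

import "fmt"

// node is a stand-in for transpiler.Node in this sketch.
type node struct {
	name    string
	matches bool // result of evaluateConstraints in the real code
}

// filterByConstraints drops entries whose constraints do not match,
// mutating the slice in place: i only advances when the element is kept.
func filterByConstraints(inputs []node) []node {
	i := 0
	for i < len(inputs) {
		if inputs[i].matches {
			i++
			continue
		}
		inputs = append(inputs[:i], inputs[i+1:]...)
	}
	return inputs
}

func main() {
	inputs := []node{
		{"logs", true},
		{"apache/metrics", false},
		{"docker/metrics", true},
	}
	for _, n := range filterByConstraints(inputs) {
		fmt.Println(n.name)
	}
}
```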
12 changes: 6 additions & 6 deletions x-pack/elastic-agent/pkg/agent/program/program.go
@@ -115,7 +115,7 @@ func groupByOutputs(single *transpiler.AST) (map[string]*transpiler.AST, error)
const (
	outputsKey = "outputs"
	outputKey  = "output"
-	streamsKey = "datasources"
+	inputsKey  = "inputs"
	typeKey    = "type"
)

@@ -168,12 +168,12 @@ func groupByOutputs(single *transpiler.AST) (map[string]*transpiler.AST, error)
		clone := cloneMap(normMap)
		delete(clone, outputsKey)
		clone[outputKey] = map[string]interface{}{n: v}
-		clone[streamsKey] = make([]map[string]interface{}, 0)
+		clone[inputsKey] = make([]map[string]interface{}, 0)

		grouped[k] = clone
	}

-	s, ok := normMap[streamsKey]
+	s, ok := normMap[inputsKey]
	if !ok {
		s = make([]interface{}, 0)
	}
@@ -199,17 +199,17 @@ func groupByOutputs(single *transpiler.AST) (map[string]*transpiler.AST, error)
			return nil, fmt.Errorf("unknown configuration output with name %s", targetName)
		}

-		streams := config[streamsKey].([]map[string]interface{})
+		streams := config[inputsKey].([]map[string]interface{})
		streams = append(streams, stream)

-		config[streamsKey] = streams
+		config[inputsKey] = streams
		grouped[targetName] = config
	}

	transpiled := make(map[string]*transpiler.AST)

	for name, group := range grouped {
-		if len(group[streamsKey].([]map[string]interface{})) == 0 {
+		if len(group[inputsKey].([]map[string]interface{})) == 0 {
			continue
		}
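`groupByOutputs` buckets each input under the output it targets, so the agent can later transpile one configuration per output. A minimal sketch of that grouping over plain maps (assumed shapes and a hypothetical helper name, not the transpiler-backed implementation; falling back to `default` when `use_output` is absent is an assumption of this sketch):

```go
package main

import "fmt"

// groupInputsByOutput buckets each input under the output named by its
// "use_output" field, defaulting to "default" when none is set.
func groupInputsByOutput(inputs []map[string]interface{}) map[string][]map[string]interface{} {
	grouped := make(map[string][]map[string]interface{})
	for _, input := range inputs {
		target := "default"
		if v, ok := input["use_output"].(string); ok && v != "" {
			target = v
		}
		grouped[target] = append(grouped[target], input)
	}
	return grouped
}

func main() {
	inputs := []map[string]interface{}{
		{"type": "logs", "use_output": "default"},
		{"type": "apache/metrics", "use_output": "monitoring"},
		{"type": "docker/metrics"}, // no use_output: goes to "default"
	}
	g := groupInputsByOutput(inputs)
	fmt.Println(len(g["default"]), len(g["monitoring"]))
}
```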
2 changes: 1 addition & 1 deletion x-pack/elastic-agent/pkg/agent/program/supported.go


@@ -4,7 +4,6 @@ filebeat:
      paths:
        - /var/log/hello1.log
        - /var/log/hello2.log
-     dataset: generic
Member:

Shouldn't this become dataset.name instead of removing it?

Contributor Author:

Yes, this became dataset.name. The thing is, we injected it previously, which I think is incorrect, since we're providing a processor to enrich the event. If you think this information is helpful in the final Beat config I will add it back, but for now I filtered it out.

Member:

Even if the Beat does not understand it today, I think in the future the Beat should, so having it there already would be helpful. Can happen in a later PR.
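If `dataset.name` were carried through as the reviewer suggests, the rendered Filebeat input might look roughly like this (a hypothetical sketch only; in the PR as written the equivalent information is injected via `add_fields` processors instead):

```yaml
filebeat:
  inputs:
    - type: log
      paths:
        - /var/log/hello1.log
        - /var/log/hello2.log
      # hypothetical: carried over from the agent config instead of being filtered out
      dataset.name: generic
      index: logs-generic-default
```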

      index: logs-generic-default
      processors:
        - add_fields:
@@ -15,23 +15,19 @@ outputs:
hosts: ["monitoring:9200"]
ca_sha256: "7lHLiyp4J8m9kw38SJ7SURJP4bXRZv/BNxyyXkCcE/M="

datasources:
- use_output: default
inputs:
- type: logs
streams:
- paths:
- /var/log/hello1.log
- /var/log/hello2.log
- namespace: testing
use_output: default
inputs:
- type: logs
streams:
- paths:
- /var/log/hello1.log
- /var/log/hello2.log
- type: apache/metrics
constraints:
- "validate_version(%{[agent.version]}, '1.0.0 - 7.0.0')"
inputs:
- type: apache/metrics
streams:
- enabled: true
metricset: info
dataset.namespace: testing
streams:
- enabled: true
metricset: info

settings.monitoring:
use_output: monitoring
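The `validate_version(%{[agent.version]}, '1.0.0 - 7.0.0')` constraint above gates an input on the agent's version. A self-contained sketch of such an inclusive range check (naive dotted-version comparison with hypothetical helper names; the agent's actual constraint evaluation is not shown in this diff):

```go
package main

import (
	"fmt"
	"strconv"
	"strings"
)

// cmp compares two dotted versions numerically, part by part:
// -1 if a < b, 0 if equal, 1 if a > b. Missing parts count as 0.
func cmp(a, b string) int {
	as, bs := strings.Split(a, "."), strings.Split(b, ".")
	for i := 0; i < len(as) || i < len(bs); i++ {
		var ai, bi int
		if i < len(as) {
			ai, _ = strconv.Atoi(as[i])
		}
		if i < len(bs) {
			bi, _ = strconv.Atoi(bs[i])
		}
		if ai != bi {
			if ai < bi {
				return -1
			}
			return 1
		}
	}
	return 0
}

// validateVersion reports whether version falls inside an inclusive
// "low - high" range such as "1.0.0 - 7.0.0".
func validateVersion(version, rng string) bool {
	parts := strings.SplitN(rng, " - ", 2)
	if len(parts) != 2 {
		return false
	}
	return cmp(version, parts[0]) >= 0 && cmp(version, parts[1]) <= 0
}

func main() {
	fmt.Println(validateVersion("6.5.0", "1.0.0 - 7.0.0")) // true
	fmt.Println(validateVersion("8.0.0", "1.0.0 - 7.0.0")) // false
}
```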
@@ -1,12 +1,10 @@
-datasources:
-  - use_output: default
-    inputs:
-      - type: event/file
-        streams:
-          - enabled: false
-            paths:
-              - var/log/hello1.log
-              - var/log/hello2.log
+inputs:
+  - type: event/file
+    streams:
+      - enabled: false
+        paths:
+          - var/log/hello1.log
+          - var/log/hello2.log
management:
  host: "localhost"
  config:
@@ -1,11 +1,9 @@
-datasources:
-  - use_output: default
-    inputs:
-      - type: event/file
-        streams:
-          - paths:
-              - /var/log/hello1.log
-              - /var/log/hello2.log
+inputs:
+  - type: event/file
+    streams:
+      - paths:
+          - /var/log/hello1.log
+          - /var/log/hello2.log
management:
  host: "localhost"
  config:
@@ -4,7 +4,6 @@ filebeat:
      paths:
        - /var/log/hello1.log
        - /var/log/hello2.log
-     dataset: generic
      index: logs-generic-default
      processors:
        - add_fields:
@@ -1,11 +1,9 @@
-datasources:
-  - use_output: default
-    inputs:
-      - type: event/file
-        streams:
-          - paths:
-              - /var/log/hello1.log
-              - /var/log/hello2.log
+inputs:
+  - type: event/file
+    streams:
+      - paths:
+          - /var/log/hello1.log
+          - /var/log/hello2.log
management:
  host: "localhost"
  config:
@@ -5,7 +5,6 @@ filebeat:
      paths:
        - /var/log/hello1.log
        - /var/log/hello2.log
-     dataset: generic
      index: logs-generic-default
      processors:
        - add_fields:
@@ -3,15 +3,13 @@ fleet:
  kibana_url: https://kibana.mydomain.com:5601
  ca_hash: 7HIpactkIAq2Y49orFOOQKurWxmmSFZhBCoQYcRhJ3Y=
  checkin_interval: 5m
-datasources:
-  - use_output: default
-    inputs:
-      - type: event/file
-        streams:
-          - enabled: true
-            paths:
-              - /var/log/hello1.log
-              - /var/log/hello2.log
+inputs:
+  - type: event/file
+    streams:
+      - enabled: true
+        paths:
+          - /var/log/hello1.log
+          - /var/log/hello2.log
management:
  host: "localhost"
  config:
@@ -5,7 +5,26 @@ filebeat:
        - /var/log/hello1.log
        - /var/log/hello2.log
      index: logs-generic-default
-     dataset: generic
      vars:
        var: value
      processors:
+       - add_fields:
+           target: "dataset"
+           fields:
+             type: logs
+             name: generic
+             namespace: default
+       - add_fields:
+           target: "stream"
+           fields:
+             type: logs
+             dataset: generic
+             namespace: default
+   - type: log
+     paths:
+       - /var/log/hello3.log
+       - /var/log/hello4.log
+     index: testtype-generic-default
+     vars:
+       var: value
+     processors:
68 changes: 38 additions & 30 deletions x-pack/elastic-agent/pkg/agent/program/testdata/single_config.yml
@@ -19,37 +19,45 @@ outputs:
hosts: ["monitoring:9200"]
ca_sha256: "7lHLiyp4J8m9kw38SJ7SURJP4bXRZv/BNxyyXkCcE/M="

datasources:
- use_output: default
inputs:
- type: docker/metrics
streams:
- metricset: status
dataset: docker.status
- metricset: info
dataset: ""
hosts: ["http://127.0.0.1:8080"]
- type: logs
streams:
- paths:
- /var/log/hello1.log
- /var/log/hello2.log
vars:
var: value
- namespace: testing
inputs:
- type: docker/metrics
use_output: default
inputs:
- type: apache/metrics
processors:
- add_fields:
fields:
should_be: first
streams:
- enabled: true
metricset: info
hosts: ["http://apache.remote"]
hosts: ["http://apache.local"]
id: apache-metrics-id
streams:
- metricset: status
dataset.name: docker.status
- metricset: info
dataset.name: ""
hosts: ["http://127.0.0.1:8080"]
- type: logs
use_output: default
streams:
- paths:
- /var/log/hello1.log
- /var/log/hello2.log
vars:
var: value
- type: logs
dataset.type: testtype
use_output: default
streams:
- paths:
- /var/log/hello3.log
- /var/log/hello4.log
vars:
var: value
- type: apache/metrics
dataset.namespace: testing
Contributor:

Probably good to add one of these test files to use:

dataset:
  namespace: testing

Contributor Author:

Do you mean with an invalid key? Keys we don't know are passed to the input by default.

Contributor:

No, I mean add a test to ensure that dataset.namespace: testing or

dataset:
  namespace: testing

both work.

use_output: default
processors:
- add_fields:
fields:
should_be: first
streams:
- enabled: true
metricset: info
hosts: ["http://apache.remote"]
hosts: ["http://apache.local"]
id: apache-metrics-id

settings.monitoring:
use_output: monitoring