fix: Remove deprecated entry_parser from scrapeconfig #2752

Merged: 1 commit, merged on Oct 14, 2020
docs/sources/clients/promtail/configuration.md (3 additions, 7 deletions)
@@ -259,12 +259,12 @@ backoff_config:
# Use map like {"foo": "bar"} to add a label foo with
# value bar.
# These can also be specified from command line:
# -client.external-labels=k1=v1,k2=v2
# (or --client.external-labels depending on your OS)
# labels supplied by the command line are applied
# to all clients configured in the `clients` section.
# NOTE: values defined in the config file will replace values
# defined on the command line for a given client if the
# label keys are the same.
external_labels:
[ <labelname>: <labelvalue> ... ]
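Since the hunk above is only an excerpt of the clients block, here is a short sketch of how external_labels sits inside a client entry; the URL and label values are illustrative, not taken from this diff:

```yaml
clients:
  # Push endpoint for this client; value is illustrative.
  - url: http://loki:3100/loki/api/v1/push
    # Static labels added to every stream sent by this client.
    # Per the note above, a key set here replaces the same key
    # supplied via -client.external-labels on the command line.
    external_labels:
      cluster: dev
      region: us-east-1
```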
@@ -299,10 +299,6 @@ of targets using a specified discovery method:
# Name to identify this scrape config in the Promtail UI.
job_name: <string>

-# Describes how to parse log lines. Supported values [cri docker raw]
-# Deprecated in favor of pipeline_stages using the cri or docker stages.
-[entry_parser: <string> | default = "docker"]

# Describes how to transform logs from targets.
[pipeline_stages: <pipeline_stages>]

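For anyone still carrying the removed option, a minimal migration sketch (the scrape job and file path are illustrative; the docker stage reproduces what the old default parser did):

```yaml
scrape_configs:
  - job_name: system
    # Previously: entry_parser: docker  (no longer accepted)
    # Now: the same parsing expressed as a pipeline stage.
    pipeline_stages:
      - docker: {}
    static_configs:
      - targets: [localhost]
        labels:
          job: varlogs
          __path__: /var/log/*.log
```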
docs/sources/design-documents/labels.md (6 additions, 6 deletions)
@@ -92,7 +92,7 @@ Our pipelined config might look like this:
```yaml
scrape_configs:
- job_name: system
- entry_parsers:
+ pipeline_stages:
- json:
timestamp:
source: time
@@ -153,7 +153,7 @@ There is an alternative configuration that could be used here to accomplish the
```yaml
scrape_configs:
- job_name: system
- entry_parsers:
+ pipeline_stages:
- json:
timestamp:
source: time
@@ -195,7 +195,7 @@ For example, the config above might be simplified to:
```yaml
scrape_configs:
- job_name: system
- entry_parsers:
+ pipeline_stages:
- docker:
```

@@ -204,7 +204,7 @@ or
```yaml
scrape_configs:
- job_name: system
- entry_parsers:
+ pipeline_stages:
- cri:
```

@@ -213,7 +213,7 @@ Which could still easily be extended to extract additional labels:
```yaml
scrape_configs:
- job_name: system
- entry_parsers:
+ pipeline_stages:
- docker:
- regex:
expr: '.*level=(?P<level>[a-zA-Z]+).*'
@@ -227,7 +227,7 @@ An even further simplification would be to attempt to autodetect the log format,
```yaml
scrape_configs:
- job_name: system
- entry_parsers:
+ pipeline_stages:
- auto:
```

pkg/promtail/api/entry_parser.go (0 additions, 104 deletions)

This file was deleted.

pkg/promtail/api/entry_parser_test.go (0 additions, 106 deletions)

This file was deleted.

pkg/promtail/promtail_test.go (0 additions, 2 deletions)
@@ -31,7 +31,6 @@ import (

"github.com/grafana/loki/pkg/logentry/stages"
"github.com/grafana/loki/pkg/logproto"
"github.com/grafana/loki/pkg/promtail/api"
"github.com/grafana/loki/pkg/promtail/client"
"github.com/grafana/loki/pkg/promtail/config"
"github.com/grafana/loki/pkg/promtail/positions"
@@ -603,7 +602,6 @@ func buildTestConfig(t *testing.T, positionsFileName string, logDirName string)

scrapeConfig := scrapeconfig.Config{
JobName: "",
-	EntryParser: api.Raw,
PipelineStages: pipeline,
RelabelConfigs: nil,
ServiceDiscoveryConfig: serviceConfig,
pkg/promtail/scrapeconfig/scrapeconfig.go (5 additions, 3 deletions)
@@ -12,13 +12,11 @@ import (
"github.com/prometheus/prometheus/pkg/relabel"

"github.com/grafana/loki/pkg/logentry/stages"
"github.com/grafana/loki/pkg/promtail/api"
)

// Config describes a job to scrape.
type Config struct {
JobName string `yaml:"job_name,omitempty"`
-	EntryParser api.EntryParser `yaml:"entry_parser"`
PipelineStages stages.PipelineStages `yaml:"pipeline_stages,omitempty"`
JournalConfig *JournalTargetConfig `yaml:"journal,omitempty"`
SyslogConfig *SyslogTargetConfig `yaml:"syslog,omitempty"`
@@ -82,7 +80,11 @@ type PushTargetConfig struct {

// DefaultScrapeConfig is the default Config.
var DefaultScrapeConfig = Config{
-	EntryParser: api.Docker,
+	PipelineStages: []interface{}{
+		map[interface{}]interface{}{
+			stages.StageTypeDocker: nil,
+		},
+	},
}

Author's review comment on the new default: [q]: unfortunately, this is because the type of pipeline stages is just []interface{}. Would love to know if there is a better way!

// HasServiceDiscoveryConfig checks to see if the service discovery used for
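If I read the new default correctly, a scrape config that omits pipeline_stages now behaves as if a single docker stage had been configured; roughly the YAML equivalent (a sketch, not part of this diff):

```yaml
scrape_configs:
  - job_name: example
    # Leaving pipeline_stages out should now be equivalent to:
    pipeline_stages:
      - docker: {}
```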
pkg/promtail/scrapeconfig/scrapeconfig_test.go (2 additions, 2 deletions)
@@ -11,15 +11,15 @@ var testYaml = `
pipeline_stages:
- regex:
expr: "./*"
- json:
timestamp:
source: time
format: RFC3339
labels:
stream:
source: json_key_name.json_sub_key_name
output:
source: log
job_name: kubernetes-pods-name
kubernetes_sd_configs:
- role: pod
pkg/promtail/targets/file/filetargetmanager.go (0 additions, 24 deletions)
@@ -87,30 +87,6 @@ func NewFileTargetManager(
return nil, err
}

-	// Backwards compatibility with old EntryParser config
-	if pipeline.Size() == 0 {
-		switch cfg.EntryParser {
-		case api.CRI:
-			level.Warn(logger).Log("msg", "WARNING!!! entry_parser config is deprecated, please change to pipeline_stages")
-			cri, err := stages.NewCRI(logger, registerer)
-			if err != nil {
-				return nil, err
-			}
-			pipeline.AddStage(cri)
-		case api.Docker:
-			level.Warn(logger).Log("msg", "WARNING!!! entry_parser config is deprecated, please change to pipeline_stages")
-			docker, err := stages.NewDocker(logger, registerer)
-			if err != nil {
-				return nil, err
-			}
-			pipeline.AddStage(docker)
-		case api.Raw:
-			level.Warn(logger).Log("msg", "WARNING!!! entry_parser config is deprecated, please change to pipeline_stages")
-		default:
-
-		}
-	}

// Add Source value to the static config target groups for unique identification
// within scrape pool. Also, default target label to localhost if target is not
// defined in promtail config.
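With the compatibility shim above removed, the cri and raw variants of entry_parser no longer map to anything automatically; here is a hedged sketch of the explicit replacements (stage names as used in the existing docs; the raw case rests on my assumption about how an explicitly empty list is handled):

```yaml
scrape_configs:
  - job_name: kubernetes-pods
    # Former entry_parser: cri
    pipeline_stages:
      - cri: {}

  - job_name: plain-files
    # Former entry_parser: raw. An explicitly empty list should keep lines
    # unparsed (assumption: it is not replaced by the docker default that
    # applies when the key is omitted entirely).
    pipeline_stages: []
```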