
New install cannot authenticate, Compose keeps quotes in passwords read from the .env file #677

Closed
mAineAc- opened this issue Feb 26, 2022 · 16 comments



mAineAc- commented Feb 26, 2022

Problem description

I run docker-compose and everything completes. I check the logs to verify that setup is complete. I can look in the .env file and see that the password for elastic is changeme, but when I try to log in on the page it tells me the password is invalid. I have cleaned out all images for all Docker sessions and started again from scratch, and I still get the same result. If I go into the Elasticsearch config and disable security, the page comes up and I can work with it, but I don't see a way to create new users that way. I have tried changing the default password to other passwords to no avail; I still get an authentication failure when trying to log in.

Extra information

Stack configuration

maineac@elastiflow:~/docker/docker-elk$ git diff
diff --git a/elasticsearch/config/elasticsearch.yml b/elasticsearch/config/elasticsearch.yml
index 3e82379..f03613b 100644
--- a/elasticsearch/config/elasticsearch.yml
+++ b/elasticsearch/config/elasticsearch.yml
@@ -8,5 +8,5 @@ network.host: 0.0.0.0

 ## X-Pack settings
 ## see https://www.elastic.co/guide/en/elasticsearch/reference/current/security-settings.html

-xpack.license.self_generated.type: trial
+xpack.license.self_generated.type: basic
 xpack.security.enabled: true
maineac@elastiflow:~/docker/docker-elk$

Docker setup

$ docker version

maineac@elastiflow:~/docker/docker-elk$ sudo docker version
[sudo] password for maineac:
Client:
 Version:           20.10.7
 API version:       1.41
 Go version:        go1.13.8
 Git commit:        20.10.7-0ubuntu5~20.04.2
 Built:             Mon Nov  1 00:34:17 2021
 OS/Arch:           linux/amd64
 Context:           default
 Experimental:      true

Server:
 Engine:
  Version:          20.10.7
  API version:      1.41 (minimum version 1.12)
  Go version:       go1.13.8
  Git commit:       20.10.7-0ubuntu5~20.04.2
  Built:            Fri Oct 22 00:45:53 2021
  OS/Arch:          linux/amd64
  Experimental:     false
 containerd:
  Version:          1.5.5-0ubuntu3~20.04.1
  GitCommit:
 runc:
  Version:          1.0.1-0ubuntu2~20.04.1
  GitCommit:
 docker-init:
  Version:          0.19.0
  GitCommit:
$ docker-compose version

maineac@elastiflow:~/docker/docker-elk$ sudo docker-compose version
docker-compose version 1.25.0, build unknown
docker-py version: 4.1.0
CPython version: 3.8.10
OpenSSL version: OpenSSL 1.1.1f  31 Mar 2020
maineac@elastiflow:~/docker/docker-elk$

Container logs

$ docker-compose logs

maineac@elastiflow:~/docker/docker-elk$ sudo docker-compose logs
Attaching to docker-elk_logstash_1, docker-elk_kibana_1, docker-elk_setup_1, docker-elk_elasticsearch_1
kibana_1         | [2022-02-25T00:10:30.146+00:00][INFO ][plugins-service] Plugin "metricsEntities" is disabled.
kibana_1         | [2022-02-25T00:10:30.309+00:00][INFO ][http.server.Preboot] http server running at http://0.0.0.0:5601
kibana_1         | [2022-02-25T00:10:30.471+00:00][INFO ][plugins-system.preboot] Setting up [1] plugins: [interactiveSetup]
kibana_1         | [2022-02-25T00:10:30.588+00:00][WARN ][config.deprecation] The default mechanism for Reporting privileges will work differently in future versions, which will affect the behavior of this cluster. Set "xpack.reporting.roles.enabled" to "false" to adopt the future behavior before upgrading.
kibana_1         | [2022-02-25T00:10:31.083+00:00][INFO ][plugins-system.standard] Setting up [107] plugins: [translations,licensing,globalSearch,globalSearchProviders,features,licenseApiGuard,usageCollection,taskManager,telemetryCollectionManager,telemetryCollectionXpack,kibanaUsageCollection,share,embeddable,uiActionsEnhanced,screenshotMode,screenshotting,banners,telemetry,newsfeed,mapsEms,fieldFormats,expressions,dataViews,charts,esUiShared,bfetch,data,savedObjects,presentationUtil,expressionShape,expressionRevealImage,expressionRepeatImage,expressionMetric,expressionImage,customIntegrations,home,searchprofiler,painlessLab,grokdebugger,management,watcher,licenseManagement,advancedSettings,spaces,security,savedObjectsTagging,reporting,lists,ingestPipelines,fileUpload,encryptedSavedObjects,dataEnhanced,cloud,snapshotRestore,eventLog,actions,alerting,triggersActionsUi,transform,stackAlerts,ruleRegistry,visualizations,canvas,visTypeXy,visTypeVislib,visTypeVega,visTypeTimelion,visTypeTagcloud,visTypeTable,visTypePie,visTypeMetric,visTypeMarkdown,expressionTagcloud,expressionMetricVis,savedObjectsManagement,console,graph,fleet,indexManagement,remoteClusters,crossClusterReplication,indexLifecycleManagement,dashboard,maps,dashboardEnhanced,visualize,visTypeTimeseries,rollup,indexPatternFieldEditor,lens,cases,timelines,discover,osquery,observability,discoverEnhanced,dataVisualizer,ml,uptime,securitySolution,infra,upgradeAssistant,monitoring,logstash,enterpriseSearch,apm,indexPatternManagement]
kibana_1         | [2022-02-25T00:10:31.120+00:00][INFO ][plugins.taskManager] TaskManager is identified by the Kibana UUID: c5c26ebe-6709-4fc3-ade2-335596339e75
kibana_1         | [2022-02-25T00:10:31.376+00:00][WARN ][plugins.security.config] Generating a random key for xpack.security.encryptionKey. To prevent sessions from being invalidated on restart, please set xpack.security.encryptionKey in the kibana.yml or use the bin/kibana-encryption-keys command.
kibana_1         | [2022-02-25T00:10:31.377+00:00][WARN ][plugins.security.config] Session cookies will be transmitted over insecure connections. This is not recommended.
kibana_1         | [2022-02-25T00:10:31.423+00:00][WARN ][plugins.security.config] Generating a random key for xpack.security.encryptionKey. To prevent sessions from being invalidated on restart, please set xpack.security.encryptionKey in the kibana.yml or use the bin/kibana-encryption-keys command.
kibana_1         | [2022-02-25T00:10:31.424+00:00][WARN ][plugins.security.config] Session cookies will be transmitted over insecure connections. This is not recommended.
kibana_1         | [2022-02-25T00:10:31.455+00:00][WARN ][plugins.reporting.config] Generating a random key for xpack.reporting.encryptionKey. To prevent sessions from being invalidated on restart, please set xpack.reporting.encryptionKey in the kibana.yml or use the bin/kibana-encryption-keys command.
kibana_1         | [2022-02-25T00:10:31.457+00:00][WARN ][plugins.reporting.config] Found 'server.host: "0.0.0.0"' in Kibana configuration. Reporting is not able to use this as the Kibana server hostname. To enable PNG/PDF Reporting to work, 'xpack.reporting.kibanaServer.hostname: localhost' is automatically set in the configuration. You can prevent this message by adding 'xpack.reporting.kibanaServer.hostname: localhost' in kibana.yml.
kibana_1         | [2022-02-25T00:10:31.474+00:00][WARN ][plugins.encryptedSavedObjects] Saved objects encryption key is not set. This will severely limit Kibana functionality. Please set xpack.encryptedSavedObjects.encryptionKey in the kibana.yml or use the bin/kibana-encryption-keys command.
kibana_1         | [2022-02-25T00:10:31.501+00:00][WARN ][plugins.actions] APIs are disabled because the Encrypted Saved Objects plugin is missing encryption key. Please set xpack.encryptedSavedObjects.encryptionKey in the kibana.yml or use the bin/kibana-encryption-keys command.
kibana_1         | [2022-02-25T00:10:31.529+00:00][WARN ][plugins.alerting] APIs are disabled because the Encrypted Saved Objects plugin is missing encryption key. Please set xpack.encryptedSavedObjects.encryptionKey in the kibana.yml or use the bin/kibana-encryption-keys command.
kibana_1         | [2022-02-25T00:10:31.568+00:00][INFO ][plugins.ruleRegistry] Installing common resources shared between all indices
kibana_1         | [2022-02-25T00:10:32.750+00:00][INFO ][plugins.screenshotting.config] Chromium sandbox provides an additional layer of protection, and is supported for Linux Ubuntu 20.04 OS. Automatically enabling Chromium sandbox.
kibana_1         | [2022-02-25T00:10:35.727+00:00][ERROR][elasticsearch-service] Unable to retrieve version information from Elasticsearch nodes. connect ECONNREFUSED 172.28.0.2:9200
kibana_1         | [2022-02-25T00:10:37.918+00:00][INFO ][plugins.screenshotting.chromium] Browser executable: /usr/share/kibana/x-pack/plugins/screenshotting/chromium/headless_shell-linux_x64/headless_shell
kibana_1         | [2022-02-25T00:10:44.350+00:00][ERROR][elasticsearch-service] Unable to retrieve version information from Elasticsearch nodes. security_exception: [security_exception] Reason: unable to authenticate user [kibana_system] for REST request [/_nodes?filter_path=nodes.*.version%2Cnodes.*.http.publish_address%2Cnodes.*.ip]
kibana_1         | [2022-02-25T00:10:50.557+00:00][INFO ][savedobjects-service] Waiting until all Elasticsearch nodes are compatible with Kibana before starting saved objects migrations...
kibana_1         | [2022-02-25T00:10:50.557+00:00][INFO ][savedobjects-service] Starting saved objects migrations
kibana_1         | [2022-02-25T00:10:51.602+00:00][INFO ][savedobjects-service] [.kibana] INIT -> CREATE_NEW_TARGET. took: 38ms.
kibana_1         | [2022-02-25T00:10:51.637+00:00][INFO ][savedobjects-service] [.kibana_task_manager] INIT -> CREATE_NEW_TARGET. took: 71ms.
kibana_1         | [2022-02-25T00:10:51.943+00:00][INFO ][savedobjects-service] [.kibana] CREATE_NEW_TARGET -> MARK_VERSION_INDEX_READY. took: 341ms.
kibana_1         | [2022-02-25T00:10:51.947+00:00][INFO ][savedobjects-service] [.kibana_task_manager] CREATE_NEW_TARGET -> MARK_VERSION_INDEX_READY. took: 310ms.
kibana_1         | [2022-02-25T00:10:52.058+00:00][INFO ][savedobjects-service] [.kibana] MARK_VERSION_INDEX_READY -> DONE. took: 115ms.
kibana_1         | [2022-02-25T00:10:52.058+00:00][INFO ][savedobjects-service] [.kibana] Migration completed after 494ms
kibana_1         | [2022-02-25T00:10:52.112+00:00][INFO ][savedobjects-service] [.kibana_task_manager] MARK_VERSION_INDEX_READY -> DONE. took: 165ms.
kibana_1         | [2022-02-25T00:10:52.112+00:00][INFO ][savedobjects-service] [.kibana_task_manager] Migration completed after 546ms
kibana_1         | [2022-02-25T00:10:52.593+00:00][INFO ][plugins-system.preboot] Stopping all plugins.
kibana_1         | [2022-02-25T00:10:52.595+00:00][INFO ][plugins-system.standard] Starting [107] plugins: [translations,licensing,globalSearch,globalSearchProviders,features,licenseApiGuard,usageCollection,taskManager,telemetryCollectionManager,telemetryCollectionXpack,kibanaUsageCollection,share,embeddable,uiActionsEnhanced,screenshotMode,screenshotting,banners,telemetry,newsfeed,mapsEms,fieldFormats,expressions,dataViews,charts,esUiShared,bfetch,data,savedObjects,presentationUtil,expressionShape,expressionRevealImage,expressionRepeatImage,expressionMetric,expressionImage,customIntegrations,home,searchprofiler,painlessLab,grokdebugger,management,watcher,licenseManagement,advancedSettings,spaces,security,savedObjectsTagging,reporting,lists,ingestPipelines,fileUpload,encryptedSavedObjects,dataEnhanced,cloud,snapshotRestore,eventLog,actions,alerting,triggersActionsUi,transform,stackAlerts,ruleRegistry,visualizations,canvas,visTypeXy,visTypeVislib,visTypeVega,visTypeTimelion,visTypeTagcloud,visTypeTable,visTypePie,visTypeMetric,visTypeMarkdown,expressionTagcloud,expressionMetricVis,savedObjectsManagement,console,graph,fleet,indexManagement,remoteClusters,crossClusterReplication,indexLifecycleManagement,dashboard,maps,dashboardEnhanced,visualize,visTypeTimeseries,rollup,indexPatternFieldEditor,lens,cases,timelines,discover,osquery,observability,discoverEnhanced,dataVisualizer,ml,uptime,securitySolution,infra,upgradeAssistant,monitoring,logstash,enterpriseSearch,apm,indexPatternManagement]
kibana_1         | [2022-02-25T00:10:55.134+00:00][INFO ][plugins.fleet] Beginning fleet setup
kibana_1         | [2022-02-25T00:10:55.168+00:00][INFO ][plugins.monitoring.monitoring] config sourced from: production cluster
kibana_1         | [2022-02-25T00:10:58.732+00:00][INFO ][http.server.Kibana] http server running at http://0.0.0.0:5601
kibana_1         | [2022-02-25T00:10:59.290+00:00][INFO ][plugins.monitoring.monitoring.kibana-monitoring] Starting monitoring stats collection
kibana_1         | [2022-02-25T00:11:00.708+00:00][INFO ][status] Kibana is now degraded
kibana_1         | [2022-02-25T00:11:00.977+00:00][INFO ][plugins.ruleRegistry] Installed common resources shared between all indices
kibana_1         | [2022-02-25T00:11:00.978+00:00][INFO ][plugins.ruleRegistry] Installing resources for index .alerts-observability.uptime.alerts
kibana_1         | [2022-02-25T00:11:00.979+00:00][INFO ][plugins.ruleRegistry] Installing resources for index .alerts-security.alerts
kibana_1         | [2022-02-25T00:11:00.979+00:00][INFO ][plugins.ruleRegistry] Installing resources for index .preview.alerts-security.alerts
kibana_1         | [2022-02-25T00:11:00.979+00:00][INFO ][plugins.ruleRegistry] Installing resources for index .alerts-observability.logs.alerts
kibana_1         | [2022-02-25T00:11:00.980+00:00][INFO ][plugins.ruleRegistry] Installing resources for index .alerts-observability.metrics.alerts
kibana_1         | [2022-02-25T00:11:00.980+00:00][INFO ][plugins.ruleRegistry] Installing resources for index .alerts-observability.apm.alerts
kibana_1         | [2022-02-25T00:11:01.140+00:00][INFO ][plugins.ruleRegistry] Installed resources for index .alerts-observability.logs.alerts
kibana_1         | [2022-02-25T00:11:01.209+00:00][INFO ][plugins.ruleRegistry] Installed resources for index .alerts-observability.metrics.alerts
kibana_1         | [2022-02-25T00:11:01.313+00:00][INFO ][plugins.ruleRegistry] Installed resources for index .alerts-observability.uptime.alerts
kibana_1         | [2022-02-25T00:11:01.442+00:00][INFO ][plugins.ruleRegistry] Installed resources for index .alerts-observability.apm.alerts
kibana_1         | [2022-02-25T00:11:01.521+00:00][INFO ][plugins.ruleRegistry] Installed resources for index .alerts-security.alerts
kibana_1         | [2022-02-25T00:11:02.481+00:00][INFO ][plugins.ruleRegistry] Installed resources for index .preview.alerts-security.alerts
kibana_1         | [2022-02-25T00:11:02.771+00:00][INFO ][plugins.securitySolution.endpoint:metadata-check-transforms-task:0.0.1] no endpoint metadata transforms found
kibana_1         | [2022-02-25T00:11:05.242+00:00][INFO ][status] Kibana is now available (was degraded)
kibana_1         | [2022-02-25T00:11:05.273+00:00][INFO ][plugins.reporting.store] Creating ILM policy for managing reporting indices: kibana-reporting
kibana_1         | [2022-02-25T00:11:39.727+00:00][INFO ][plugins.fleet] Fleet setup completed
kibana_1         | [2022-02-25T00:11:39.739+00:00][INFO ][plugins.securitySolution] Dependent plugin setup complete - Starting ManifestTask
kibana_1         | [2022-02-25T00:18:50.305+00:00][INFO ][plugins.security.routes] Logging in with provider "basic" (basic)
kibana_1         | [2022-02-25T00:19:12.805+00:00][INFO ][plugins.security.routes] Logging in with provider "basic" (basic)
kibana_1         | [2022-02-25T00:20:05.378+00:00][INFO ][plugins.security.routes] Logging in with provider "basic" (basic)
kibana_1         | [2022-02-25T00:20:06.697+00:00][INFO ][plugins.security.routes] Logging in with provider "basic" (basic)
kibana_1         | [2022-02-25T00:20:14.409+00:00][INFO ][plugins.security.routes] Logging in with provider "basic" (basic)
kibana_1         | [2022-02-25T00:20:28.769+00:00][INFO ][plugins.security.routes] Logging in with provider "basic" (basic)
kibana_1         | [2022-02-25T00:20:32.313+00:00][INFO ][plugins.security.routes] Logging in with provider "basic" (basic)
kibana_1         | [2022-02-25T02:11:04.692+00:00][INFO ][plugins.securitySolution.endpoint:metadata-check-transforms-task:0.0.1] no endpoint metadata transforms found
kibana_1         | [2022-02-25T04:11:05.138+00:00][INFO ][plugins.securitySolution.endpoint:metadata-check-transforms-task:0.0.1] no endpoint metadata transforms found
kibana_1         | [2022-02-25T06:11:05.598+00:00][INFO ][plugins.securitySolution.endpoint:metadata-check-transforms-task:0.0.1] no endpoint metadata transforms found
kibana_1         | [2022-02-25T08:11:06.094+00:00][INFO ][plugins.securitySolution.endpoint:metadata-check-transforms-task:0.0.1] no endpoint metadata transforms found
kibana_1         | [2022-02-25T10:11:06.549+00:00][INFO ][plugins.securitySolution.endpoint:metadata-check-transforms-task:0.0.1] no endpoint metadata transforms found
kibana_1         | [2022-02-25T12:11:07.028+00:00][INFO ][plugins.securitySolution.endpoint:metadata-check-transforms-task:0.0.1] no endpoint metadata transforms found
kibana_1         | [2022-02-25T14:11:07.473+00:00][INFO ][plugins.securitySolution.endpoint:metadata-check-transforms-task:0.0.1] no endpoint metadata transforms found
kibana_1         | [2022-02-25T16:11:07.942+00:00][INFO ][plugins.securitySolution.endpoint:metadata-check-transforms-task:0.0.1] no endpoint metadata transforms found
kibana_1         | [2022-02-25T18:11:08.424+00:00][INFO ][plugins.securitySolution.endpoint:metadata-check-transforms-task:0.0.1] no endpoint metadata transforms found
kibana_1         | [2022-02-25T20:11:08.852+00:00][INFO ][plugins.securitySolution.endpoint:metadata-check-transforms-task:0.0.1] no endpoint metadata transforms found
kibana_1         | [2022-02-25T22:11:09.335+00:00][INFO ][plugins.securitySolution.endpoint:metadata-check-transforms-task:0.0.1] no endpoint metadata transforms found
kibana_1         | [2022-02-26T00:11:09.774+00:00][INFO ][plugins.securitySolution.endpoint:metadata-check-transforms-task:0.0.1] no endpoint metadata transforms found
kibana_1         | [2022-02-26T02:11:10.219+00:00][INFO ][plugins.securitySolution.endpoint:metadata-check-transforms-task:0.0.1] no endpoint metadata transforms found
kibana_1         | [2022-02-26T04:11:10.723+00:00][INFO ][plugins.securitySolution.endpoint:metadata-check-transforms-task:0.0.1] no endpoint metadata transforms found
kibana_1         | [2022-02-26T06:11:11.191+00:00][INFO ][plugins.securitySolution.endpoint:metadata-check-transforms-task:0.0.1] no endpoint metadata transforms found
kibana_1         | [2022-02-26T08:11:11.699+00:00][INFO ][plugins.securitySolution.endpoint:metadata-check-transforms-task:0.0.1] no endpoint metadata transforms found
kibana_1         | [2022-02-26T10:11:12.125+00:00][INFO ][plugins.securitySolution.endpoint:metadata-check-transforms-task:0.0.1] no endpoint metadata transforms found
kibana_1         | [2022-02-26T12:11:12.573+00:00][INFO ][plugins.securitySolution.endpoint:metadata-check-transforms-task:0.0.1] no endpoint metadata transforms found
kibana_1         | [2022-02-26T14:11:13.078+00:00][INFO ][plugins.securitySolution.endpoint:metadata-check-transforms-task:0.0.1] no endpoint metadata transforms found
logstash_1       | Using bundled JDK: /usr/share/logstash/jdk
logstash_1       | OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
logstash_1       | Sending Logstash logs to /usr/share/logstash/logs which is now configured via log4j2.properties
logstash_1       | [2022-02-25T00:10:40,161][INFO ][logstash.runner          ] Log4j configuration path used is: /usr/share/logstash/config/log4j2.properties
logstash_1       | [2022-02-25T00:10:40,173][INFO ][logstash.runner          ] Starting Logstash {"logstash.version"=>"8.0.0", "jruby.version"=>"jruby 9.2.20.1 (2.5.8) 2021-11-30 2a2962fbd1 OpenJDK 64-Bit Server VM 11.0.13+8 on 11.0.13+8 +indy +jit [linux-x86_64]"}
logstash_1       | [2022-02-25T00:10:40,176][INFO ][logstash.runner          ] JVM bootstrap flags: [-Xms1g, -Xmx1g, -XX:+UseConcMarkSweepGC, -XX:CMSInitiatingOccupancyFraction=75, -XX:+UseCMSInitiatingOccupancyOnly, -Djava.awt.headless=true, -Dfile.encoding=UTF-8, -Djruby.compile.invokedynamic=true, -Djruby.jit.threshold=0, -Djruby.regexp.interruptible=true, -XX:+HeapDumpOnOutOfMemoryError, -Djava.security.egd=file:/dev/urandom, -Dlog4j2.isThreadContextMapInheritable=true, --add-opens=java.base/java.security=ALL-UNNAMED, --add-opens=java.base/java.io=ALL-UNNAMED, --add-opens=java.base/java.nio.channels=ALL-UNNAMED, --add-opens=java.base/sun.nio.ch=ALL-UNNAMED, --add-opens=java.management/sun.management=ALL-UNNAMED, -Dls.cgroup.cpuacct.path.override=/, -Dls.cgroup.cpu.path.override=/, -Xmx256m, -Xms256m]
logstash_1       | [2022-02-25T00:10:40,217][INFO ][logstash.settings        ] Creating directory {:setting=>"path.queue", :path=>"/usr/share/logstash/data/queue"}
logstash_1       | [2022-02-25T00:10:40,231][INFO ][logstash.settings        ] Creating directory {:setting=>"path.dead_letter_queue", :path=>"/usr/share/logstash/data/dead_letter_queue"}
logstash_1       | [2022-02-25T00:10:40,732][INFO ][logstash.agent           ] No persistent UUID file found. Generating new UUID {:uuid=>"b5e50944-be28-45d7-a3ae-e717e78847bc", :path=>"/usr/share/logstash/data/uuid"}
logstash_1       | [2022-02-25T00:10:42,668][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600, :ssl_enabled=>false}
logstash_1       | [2022-02-25T00:10:44,494][INFO ][org.reflections.Reflections] Reflections took 107 ms to scan 1 urls, producing 120 keys and 417 values
logstash_1       | [2022-02-25T00:10:46,045][INFO ][logstash.javapipeline    ] Pipeline `main` is configured with `pipeline.ecs_compatibility: v8` setting. All plugins in this pipeline will default to `ecs_compatibility => v8` unless explicitly configured otherwise.
logstash_1       | [2022-02-25T00:10:46,171][INFO ][logstash.outputs.elasticsearch][main] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["//elasticsearch:9200"]}
logstash_1       | [2022-02-25T00:10:47,015][INFO ][logstash.outputs.elasticsearch][main] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://logstash_internal:xxxxxx@elasticsearch:9200/]}}
logstash_1       | [2022-02-25T00:10:47,657][WARN ][logstash.outputs.elasticsearch][main] Restored connection to ES instance {:url=>"http://logstash_internal:xxxxxx@elasticsearch:9200/"}
logstash_1       | [2022-02-25T00:10:47,679][INFO ][logstash.outputs.elasticsearch][main] Elasticsearch version determined (8.0.0) {:es_version=>8}
logstash_1       | [2022-02-25T00:10:47,682][WARN ][logstash.outputs.elasticsearch][main] Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>8}
logstash_1       | [2022-02-25T00:10:47,801][INFO ][logstash.outputs.elasticsearch][main] Config is compliant with data streams. `data_stream => auto` resolved to `true`
logstash_1       | [2022-02-25T00:10:47,806][INFO ][logstash.outputs.elasticsearch][main] Config is compliant with data streams. `data_stream => auto` resolved to `true`
logstash_1       | [2022-02-25T00:10:47,811][WARN ][logstash.outputs.elasticsearch][main] Elasticsearch Output configured with `ecs_compatibility => v8`, which resolved to an UNRELEASED preview of version 8.0.0 of the Elastic Common Schema. Once ECS v8 and an updated release of this plugin are publicly available, you will need to update this plugin to resolve this warning.
logstash_1       | [2022-02-25T00:10:47,895][INFO ][logstash.outputs.elasticsearch][main] Using a default mapping template {:es_version=>8, :ecs_compatibility=>:v8}
logstash_1       | [2022-02-25T00:10:47,940][INFO ][logstash.javapipeline    ][main] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>8, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>1000, "pipeline.sources"=>["/usr/share/logstash/pipeline/logstash.conf"], :thread=>"#<Thread:0x13382a40 run>"}
logstash_1       | [2022-02-25T00:10:48,034][INFO ][logstash.outputs.elasticsearch][main] Installing Elasticsearch template {:name=>"ecs-logstash"}
logstash_1       | [2022-02-25T00:10:49,150][INFO ][logstash.javapipeline    ][main] Pipeline Java execution initialization time {"seconds"=>1.2}
logstash_1       | [2022-02-25T00:10:49,297][INFO ][logstash.inputs.beats    ][main] Starting input listener {:address=>"0.0.0.0:5044"}
logstash_1       | [2022-02-25T00:10:49,548][INFO ][logstash.javapipeline    ][main] Pipeline started {"pipeline.id"=>"main"}
logstash_1       | [2022-02-25T00:10:49,572][INFO ][logstash.inputs.tcp      ][main][c15c0e9d24530a53e2530254c037deff8c49a2140c6b89c577cbcde717c01af0] Starting tcp input listener {:address=>"0.0.0.0:5000", :ssl_enable=>false}
logstash_1       | [2022-02-25T00:10:49,650][INFO ][org.logstash.beats.Server][main][27dfbb7f38dfc5bffbd7ce11deb78daf39a131fcbb4eceab2c6a1d7100069751] Starting server on port: 5044
logstash_1       | [2022-02-25T00:10:49,711][INFO ][logstash.agent           ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
setup_1          | -------- Fri Feb 25 00:10:03 UTC 2022 --------
setup_1          | [+] Waiting for availability of Elasticsearch
setup_1          |    ⠿ Elasticsearch is running
setup_1          | [+] Role 'logstash_writer'
setup_1          |    ⠿ Creating/updating
setup_1          | [+] User 'kibana_system'
setup_1          |    ⠿ User exists, setting password
setup_1          | [+] User 'logstash_internal'
setup_1          |    ⠿ User does not exist, creating
elasticsearch_1  | Created elasticsearch keystore in /usr/share/elasticsearch/config/elasticsearch.keystore
elasticsearch_1  | {"@timestamp":"2022-02-25T00:10:21.919Z", "log.level": "INFO", "message":"version[8.0.0], pid[7], build[default/docker/1b6a7ece17463df5ff54a3e1302d825889aa1161/2022-02-03T16:47:57.507843096Z], OS[Linux/5.4.0-100-generic/amd64], JVM[Eclipse Adoptium/OpenJDK 64-Bit Server VM/17.0.1/17.0.1+12]", "ecs.version": "1.2.0","service.name":"ES_ECS","event.dataset":"elasticsearch.server","process.thread.name":"main","log.logger":"org.elasticsearch.node.Node","elasticsearch.node.name":"4c280bf74c9b","elasticsearch.cluster.name":"docker-cluster"}
elasticsearch_1  | {"@timestamp":"2022-02-25T00:10:21.927Z", "log.level": "INFO", "message":"JVM home [/usr/share/elasticsearch/jdk], using bundled JDK [true]", "ecs.version": "1.2.0","service.name":"ES_ECS","event.dataset":"elasticsearch.server","process.thread.name":"main","log.logger":"org.elasticsearch.node.Node","elasticsearch.node.name":"4c280bf74c9b","elasticsearch.cluster.name":"docker-cluster"}
elasticsearch_1  | {"@timestamp":"2022-02-25T00:10:21.927Z", "log.level": "INFO", "message":"JVM arguments [-Xshare:auto, -Des.networkaddress.cache.ttl=60, -Des.networkaddress.cache.negative.ttl=10, -Djava.security.manager=allow, -XX:+AlwaysPreTouch, -Xss1m, -Djava.awt.headless=true, -Dfile.encoding=UTF-8, -Djna.nosys=true, -XX:-OmitStackTraceInFastThrow, -XX:+ShowCodeDetailsInExceptionMessages, -Dio.netty.noUnsafe=true, -Dio.netty.noKeySetOptimization=true, -Dio.netty.recycler.maxCapacityPerThread=0, -Dio.netty.allocator.numDirectArenas=0, -Dlog4j.shutdownHookEnabled=false, -Dlog4j2.disable.jmx=true, -Dlog4j2.formatMsgNoLookups=true, -Djava.locale.providers=SPI,COMPAT, --add-opens=java.base/java.io=ALL-UNNAMED, -XX:+UseG1GC, -Djava.io.tmpdir=/tmp/elasticsearch-17864427426443201495, -XX:+HeapDumpOnOutOfMemoryError, -XX:+ExitOnOutOfMemoryError, -XX:HeapDumpPath=data, -XX:ErrorFile=logs/hs_err_pid%p.log, -Xlog:gc*,gc+age=trace,safepoint:file=logs/gc.log:utctime,pid,tags:filecount=32,filesize=64m, -Des.cgroups.hierarchy.override=/, -Xmx256m, -Xms256m, -XX:MaxDirectMemorySize=134217728, -XX:G1HeapRegionSize=4m, -XX:InitiatingHeapOccupancyPercent=30, -XX:G1ReservePercent=15, -Des.path.home=/usr/share/elasticsearch, -Des.path.conf=/usr/share/elasticsearch/config, -Des.distribution.flavor=default, -Des.distribution.type=docker, -Des.bundled_jdk=true]", "ecs.version": "1.2.0","service.name":"ES_ECS","event.dataset":"elasticsearch.server","process.thread.name":"main","log.logger":"org.elasticsearch.node.Node","elasticsearch.node.name":"4c280bf74c9b","elasticsearch.cluster.name":"docker-cluster"}
elasticsearch_1  | {"@timestamp":"2022-02-25T00:10:25.331Z", "log.level": "WARN", "message":"SLF4J: Failed to load class \"org.slf4j.impl.StaticLoggerBinder\".", "ecs.version": "1.2.0","service.name":"ES_ECS","event.dataset":"elasticsearch.server","process.thread.name":"main","log.logger":"stderr","elasticsearch.node.name":"4c280bf74c9b","elasticsearch.cluster.name":"docker-cluster"}
elasticsearch_1  | {"@timestamp":"2022-02-25T00:10:25.333Z", "log.level": "WARN", "message":"SLF4J: Defaulting to no-operation (NOP) logger implementation", "ecs.version": "1.2.0","service.name":"ES_ECS","event.dataset":"elasticsearch.server","process.thread.name":"main","log.logger":"stderr","elasticsearch.node.name":"4c280bf74c9b","elasticsearch.cluster.name":"docker-cluster"}
elasticsearch_1  | {"@timestamp":"2022-02-25T00:10:25.333Z", "log.level": "WARN", "message":"SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further details.", "ecs.version": "1.2.0","service.name":"ES_ECS","event.dataset":"elasticsearch.server","process.thread.name":"main","log.logger":"stderr","elasticsearch.node.name":"4c280bf74c9b","elasticsearch.cluster.name":"docker-cluster"}
elasticsearch_1  | {"@timestamp":"2022-02-25T00:10:27.457Z", "log.level": "INFO", "message":"loaded module [aggs-matrix-stats]", "ecs.version": "1.2.0","service.name":"ES_ECS","event.dataset":"elasticsearch.server","process.thread.name":"main","log.logger":"org.elasticsearch.plugins.PluginsService","elasticsearch.node.name":"4c280bf74c9b","elasticsearch.cluster.name":"docker-cluster"}
elasticsearch_1  | {"@timestamp":"2022-02-25T00:10:27.457Z", "log.level": "INFO", "message":"loaded module [analysis-common]", "ecs.version": "1.2.0","service.name":"ES_ECS","event.dataset":"elasticsearch.server","process.thread.name":"main","log.logger":"org.elasticsearch.plugins.PluginsService","elasticsearch.node.name":"4c280bf74c9b","elasticsearch.cluster.name":"docker-cluster"}
elasticsearch_1  | {"@timestamp":"2022-02-25T00:10:27.458Z", "log.level": "INFO", "message":"loaded module [constant-keyword]", "ecs.version": "1.2.0","service.name":"ES_ECS","event.dataset":"elasticsearch.server","process.thread.name":"main","log.logger":"org.elasticsearch.plugins.PluginsService","elasticsearch.node.name":"4c280bf74c9b","elasticsearch.cluster.name":"docker-cluster"}
@antoineco
Collaborator

antoineco commented Feb 26, 2022

@mAineAc- I see that the setup container managed to create its roles and users using the ELASTIC_PASSWORD, which means the elastic:changeme credentials are valid.

In the logs you provided, it seems like Elasticsearch hasn't finished initializing yet. My first recommendation would be to tail the logs of both Elasticsearch and Kibana, and ensure that everything is really up and running before trying to log in.
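As a sketch (assuming the default docker-elk service names), both sets of logs can be followed together until each component reports ready:

```shell
# Follow the Elasticsearch and Kibana logs side by side. Wait for
# Elasticsearch to report it has started and for Kibana to log
# "Kibana is now available" before attempting to log in.
docker-compose logs -f elasticsearch kibana
```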


Please also note that changing the passwords in the .env file does not update the passwords of Elasticsearch's users, it simply configures the clients (Kibana and Logstash) to use those passwords to connect to Elasticsearch.

Users are only initialized from the .env file on one occasion: during the initial startup of the stack. To reset the stack and all its data including passwords, you need to run docker-compose down -v. Upon the next up, the stack will start fresh and all users will be re-initialized from the .env file.
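A minimal sketch of that reset sequence, run from the docker-elk directory:

```shell
# Stop the stack AND delete its named volumes. The -v flag wipes all
# Elasticsearch data, including the users initialized from the .env file.
docker-compose down -v

# On the next start, users are re-created from the current .env values.
docker-compose up -d
```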

@mAineAc-
Author

Yes, I have brought it up and down repeatedly, and it never works. I was limited in how much I could paste for logs, so I cut them short. I have left the stack up for over a day and it still fails to log in.

@antoineco
Collaborator

It could be a browser issue then (privacy add-ons?), in which case it might help to try private mode or a different browser. Opening the dev console might reveal errors too.

At least I see no evidence that the stack components themselves aren't working: Kibana is connected to Elasticsearch since it shows a login screen, and the setup container completed.

We have automated tests to ensure that we don't break the stack when we push changes, so I'm confident Kibana is working with the aforementioned user.
Like I said, I would look into the browser's console and make sure it doesn't have cached data for localhost:5601, etc.

@mAineAc-
Author

I will test and get back to you.

@mAineAc-
Author

elastic

Tested in a new browser and in incognito mode.

@mAineAc-
Author

elastic-console

This is the browser console after a login attempt.

@VN-CERT

VN-CERT commented Feb 28, 2022

The password is 'changeme'.
Copy the whole string, including the ' quotes.

@antoineco
Collaborator

@VN-CERT wow, I think you're right. From the docs:

Syntax rules

The following syntax rules apply to the .env file:

  • Compose expects each line in an env file to be in VAR=VAL format.
    [...]
  • There is no special handling of quotation marks. This means that they are part of the VAL.

I was going to say that I swear I managed to log in with "changeme" (no quotes), but I think I know why: I typically export my variables in the shell (export ELASTIC_PASSWORD='something'), and in that case the quotes are interpreted by the shell.
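The difference can be reproduced in plain shell. A minimal sketch (the /tmp/env-demo path is only an illustration):

```shell
# In the shell, quotes are syntax: they are consumed before the value is stored.
export ELASTIC_PASSWORD='changeme'
echo "$ELASTIC_PASSWORD"   # changeme (no quotes)

# In a file that is read verbatim, the quotes are just characters of the value.
printf "ELASTIC_PASSWORD='changeme'\n" > /tmp/env-demo
value="$(cut -d= -f2- /tmp/env-demo)"
echo "$value"              # 'changeme' (quotes kept)
```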

@mAineAc- could you please confirm that this was indeed the issue before I push the fix?

@antoineco antoineco added the bug label Feb 28, 2022
@antoineco
Collaborator

Actually, I have a feeling that the fact I did not catch this might be related to Compose V2.

I'm using Compose V2, and the quotes are definitely not part of the password.

Without quotes:

$ curl -D- http://localhost:9200 -u "elastic:changeme"
HTTP/1.1 200 OK
...

With quotes:

$ curl -D- http://localhost:9200 -u "elastic:'changeme'"
HTTP/1.1 401 Unauthorized
...

@antoineco
Collaborator

antoineco commented Feb 28, 2022

Update: same result with Compose V1 (1.29) 🤔

$ docker-compose-v1 version
docker-compose version 1.29.2, build 5becea4c
docker-py version: 5.0.0
CPython version: 3.7.10
OpenSSL version: OpenSSL 1.1.0l  10 Sep 2019
$ docker-compose-v1 exec elasticsearch printenv ELASTIC_PASSWORD
changeme

Without quotes:

$ curl -D- http://localhost:9200 -u "elastic:changeme"
HTTP/1.1 200 OK
...

With quotes:

$ curl -D- http://localhost:9200 -u "elastic:'changeme'"
HTTP/1.1 401 Unauthorized
...

I see that @mAineAc- is using Compose 1.25.0.
In the changelog for Compose 1.26.0, I see the following entry:

Added python-dotenv to delegate .env file processing.

The introduction of this library may have had an influence on the handling of quotes. (edit: apparently it did: docker/compose#2854)
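The change in behavior can be sketched in shell. This is only an illustration of the old vs. new parsing, not python-dotenv's actual implementation:

```shell
line="ELASTIC_PASSWORD='changeme'"

# Compose < 1.26: everything after the first '=' is the value, quotes included.
old="${line#*=}"
echo "old Compose sees: $old"   # 'changeme'

# Compose >= 1.26 (python-dotenv): surrounding quotes are stripped.
new="$old"
new="${new#\'}"
new="${new%\'}"
echo "new Compose sees: $new"   # changeme
```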

However, it is unclear to me why the documentation states that there is no special handling of quotation marks, when there obviously is. We might want to report this to Docker. Edit: I just opened docker/docs#14318

@antoineco
Collaborator

@VN-CERT and @mAineAc-, I'd be curious to see the output of docker-compose config in your environments.

@mAineAc-
Author

Using the quotes did work. I got logged in with that.

@mAineAc-
Author

mAineAc- commented Feb 28, 2022

maineac@elastiflow:~/docker/docker-elk$ docker-compose config
networks:
  elk:
    driver: bridge
services:
  elasticsearch:
    build:
      args:
        ELK_VERSION: 8.0.0
      context: /home/maineac/docker/docker-elk/elasticsearch
    environment:
      ELASTIC_PASSWORD: '''changeme'''
      ES_JAVA_OPTS: -Xmx256m -Xms256m
      discovery.type: single-node
    networks:
      elk: null
    ports:
    - published: 9200
      target: 9200
    - published: 9300
      target: 9300
    volumes:
    - /home/maineac/docker/docker-elk/elasticsearch/config/elasticsearch.yml:/usr/share/elasticsearch/config/elasticsearch.yml:ro,z
    - elasticsearch:/usr/share/elasticsearch/data:z
  kibana:
    build:
      args:
        ELK_VERSION: 8.0.0
      context: /home/maineac/docker/docker-elk/kibana
    depends_on:
    - elasticsearch
    environment:
      KIBANA_SYSTEM_PASSWORD: '''changeme'''
    networks:
      elk: null
    ports:
    - published: 5601
      target: 5601
    volumes:
    - /home/maineac/docker/docker-elk/kibana/config/kibana.yml:/usr/share/kibana/config/kibana.yml:ro,Z
  logstash:
    build:
      args:
        ELK_VERSION: 8.0.0
      context: /home/maineac/docker/docker-elk/logstash
    depends_on:
    - elasticsearch
    environment:
      LOGSTASH_INTERNAL_PASSWORD: '''changeme'''
      LS_JAVA_OPTS: -Xmx256m -Xms256m
    networks:
      elk: null
    ports:
    - published: 5044
      target: 5044
    - protocol: tcp
      published: 5000
      target: 5000
    - protocol: udp
      published: 5000
      target: 5000
    - published: 9600
      target: 9600
    volumes:
    - /home/maineac/docker/docker-elk/logstash/config/logstash.yml:/usr/share/logstash/config/logstash.yml:ro,Z
    - /home/maineac/docker/docker-elk/logstash/pipeline:/usr/share/logstash/pipeline:ro,Z
  setup:
    build:
      args:
        ELK_VERSION: 8.0.0
      context: /home/maineac/docker/docker-elk/setup
    environment:
      ELASTIC_PASSWORD: '''changeme'''
      KIBANA_SYSTEM_PASSWORD: '''changeme'''
      LOGSTASH_INTERNAL_PASSWORD: '''changeme'''
    networks:
      elk: null
    volumes:
    - setup:/state:Z
version: '3.2'
volumes:
  elasticsearch: {}
  setup: {}

@antoineco
Collaborator

antoineco commented Feb 28, 2022

@mAineAc- thanks for checking! Compose indeed did the wrong thing here:

ELASTIC_PASSWORD: '''changeme'''

I believe the conclusion is that we should recommend Compose ≥ 1.26.0, since we already advise quoting values to avoid issues with passwords containing special characters, as seen in #667.

For people using Compose 1.22.0 to 1.25.5, the setup should be considered "unsupported but known to be working", with the recommendation to remove surrounding quotes from values in the .env file:

# .env
ELASTIC_VERSION=8.0.0
ELASTIC_PASSWORD=changeme
LOGSTASH_INTERNAL_PASSWORD=changeme
KIBANA_SYSTEM_PASSWORD=changeme

@antoineco antoineco changed the title new install cannot authenticate New install cannot authenticate, Compose keeps quotes in passwords read from the .env file Mar 1, 2022
@asgharkhan

This comment was marked as off-topic.

@antoineco
Collaborator

@asgharkhan I converted your comment into a new issue (#680) because

  1. this issue is closed
  2. the problem seems unrelated

DanBrown47 pushed a commit to DanBrown47/docker-elk that referenced this issue Jun 22, 2023
Prior to this version, surrounding quotes (single and double) were
preserved in values coming from the '.env' file, which is
counter-intuitive and goes against the recommended practice of quoting
values containing special characters, such as passwords.

Closes deviantony#677