Module does not seem to work with splunkforwarder 7.X #186

Closed
rbclark opened this issue Jun 26, 2018 · 6 comments · Fixed by #191
Comments

@rbclark
Contributor

rbclark commented Jun 26, 2018

Affected Puppet, Ruby, OS and module versions/distributions

  • Puppet: 4.10.6
  • Ruby: 2.0.0p648
  • Distribution: CentOS
  • Module version: 7.2.0

How to reproduce (e.g. Puppet code you use)

  class { '::splunk::params':
    version => '=7.1.1',
    build   => '8f0ead9ec3db',
  }

  class { '::splunk::forwarder':
    pkg_provider => 'yum',
    splunk_user  => 'splunk',
  }

What are you seeing

Notice: /Stage[main]/Splunk::Forwarder/Package[splunkforwarder]/ensure: created
Notice: /Stage[main]/Splunk::Platform::Posix/Exec[license_splunkforwarder]/returns: Password must contain at least:
Notice: /Stage[main]/Splunk::Platform::Posix/Exec[license_splunkforwarder]/returns:    * 8 total printable ASCII character(s).
Notice: /Stage[main]/Splunk::Platform::Posix/Exec[license_splunkforwarder]/returns: 
Notice: /Stage[main]/Splunk::Platform::Posix/Exec[license_splunkforwarder]/returns: This appears to be your first time running this version of Splunk.
Notice: /Stage[main]/Splunk::Platform::Posix/Exec[license_splunkforwarder]/returns: 
Notice: /Stage[main]/Splunk::Platform::Posix/Exec[license_splunkforwarder]/returns: An Admin password must be set before installation proceeds.
Notice: /Stage[main]/Splunk::Platform::Posix/Exec[license_splunkforwarder]/returns: tcgetattr: Inappropriate ioctl for device
Notice: /Stage[main]/Splunk::Platform::Posix/Exec[license_splunkforwarder]/returns: WARNING: error changing terminal modes - password will echo!
Notice: /Stage[main]/Splunk::Platform::Posix/Exec[license_splunkforwarder]/returns: Please enter a new password: 
Error: 'splunk start --accept-license --answer-yes' returned 1 instead of one of [0]
Error: /Stage[main]/Splunk::Platform::Posix/Exec[license_splunkforwarder]/returns: change from notrun to 0 failed: 'splunk start --accept-license --answer-yes' returned 1 instead of one of [0]

What behaviour did you expect instead

A successful install and configuration of the splunkforwarder.

Any additional information you'd like to impart

Splunk Enterprise 7.1 introduces a new password scheme for Splunk software users. This scheme includes additional settings and configuration options, which can affect how you upgrade if you use scripts to automate the upgrade process. You might need to change your upgrade scripts before performing scripted upgrades. Specifically, confirm that you do not pass any illegal arguments to the Splunk CLI for starting or restarting Splunk Enterprise during the upgrade, as this could result in a situation where Splunk Enterprise does not start after the upgrade has completed.

From http://docs.splunk.com/Documentation/Splunk/7.1.1/Installation/AboutupgradingREADTHISFIRST

@kreeuwijk
Contributor

This is new behavior in Splunk Forwarder 7.1 and above. Downgrading to a 7.0.x version will work as a temporary workaround.

@ralfbosz
Contributor

The execs in platform/posix.pp need " --no-prompt" added.
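
For illustration, a minimal sketch of what that change could look like, assuming the exec resembles the one visible in the logs above (the resource title, path and onlyif check are taken from those logs, not from the module source):

  # Hypothetical sketch only: append --no-prompt so the 7.1+ first-run
  # password prompt is never shown. Paths and the onlyif check mirror the
  # log output above and may differ from the actual platform/posix.pp.
  exec { 'license_splunkforwarder':
    command => 'splunk start --accept-license --answer-yes --no-prompt',
    path    => '/opt/splunkforwarder/bin:/bin:/usr/bin',
    user    => 'splunk',
    onlyif  => '/usr/bin/test -f /opt/splunkforwarder/ftr',
  }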

ralfbosz added a commit to ralfbosz/puppet-splunk that referenced this issue Aug 16, 2018
When splunk is started the first time on 7.1.x (and
higher) there is input expected, disable this with
--no-prompt.
This addresses issue voxpupuli#186
ralfbosz added a commit to ralfbosz/puppet-splunk that referenced this issue Aug 16, 2018
When splunk is started the first time on 7.1.x (and
higher) there is input expected, disable this with
--no-prompt. Fixes voxpupuli#186
@ralfbosz
Contributor

Sorry for the many pushes. A workaround is to use a user-seed.conf in which you define a NEW password; that way --no-prompt is not needed. Only tested on RedHat 7.

It's also a good way to set the password...
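
For reference, a sketch of how such a seed file could be laid down with Puppet. The path, ownership and placeholder password here are assumptions; the module itself does not manage this file:

  # Hypothetical sketch: drop a user-seed.conf before Splunk's first start so
  # 7.1+ creates the admin user non-interactively. Replace the placeholder
  # password; nothing here comes from the module itself.
  file { '/opt/splunkforwarder/etc/system/local/user-seed.conf':
    ensure  => file,
    owner   => 'splunk',
    group   => 'splunk',
    mode    => '0600',
    content => "[user_info]\nUSERNAME = admin\nPASSWORD = changeme\n",
  }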

@juanlittledevil

This did not actually solve the issue. Also, I don't see any support for user-seed.conf in the module. A simpler solution would be to simply add --seed-passwd changeme instead of --no-prompt, as this makes Splunk behave like it did prior to 7.1.x.
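
Assuming the --seed-passwd flag works as described in that comment (not verified here), the command run by the exec sketched earlier would simply become something like:

  splunk start --accept-license --answer-yes --seed-passwd changeme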

@zhasaan

zhasaan commented Apr 11, 2019

I am using splunkforwarder version 7.2.2 and am seeing the following behaviour in my local acceptance testing:

==================================================================
Notice: /Stage[main]/Splunk::Forwarder/File[/opt/splunkforwarder/etc/system/local/server.conf]/ensure: created
Debug: /Stage[main]/Splunk::Forwarder/File[/opt/splunkforwarder/etc/system/local/server.conf]: The container Class[Splunk::Forwarder] will propagate my refresh event
Debug: Class[Splunk::Forwarder]: The container Stage[main] will propagate my refresh event
Debug: Exec[license_splunkforwarder]: Executing check '/usr/bin/test -f /opt/splunkforwarder/ftr'
Debug: Executing with uid=splunk: '/usr/bin/test -f /opt/splunkforwarder/ftr'
Debug: Exec[license_splunkforwarder]: Executing 'splunk ftr --accept-license --answer-yes --no-prompt'
Debug: Executing with uid=splunk: 'splunk ftr --accept-license --answer-yes --no-prompt'
Notice: /Stage[main]/Splunk::Platform::Posix/Exec[license_splunkforwarder]/returns: executed successfully
Info: /Stage[main]/Splunk::Platform::Posix/Exec[license_splunkforwarder]: Scheduling refresh of Service[splunk]
Debug: /Stage[main]/Splunk::Platform::Posix/Exec[license_splunkforwarder]: The container Class[Splunk::Platform::Posix] will propagate my refresh event
Debug: Exec[enable_splunkforwarder]: Executing '/opt/splunkforwarder/bin/splunk enable boot-start -user splunk'
Debug: Executing: '/opt/splunkforwarder/bin/splunk enable boot-start -user splunk'
Notice: /Stage[main]/Splunk::Platform::Posix/Exec[enable_splunkforwarder]/returns: executed successfully
Info: /Stage[main]/Splunk::Platform::Posix/Exec[enable_splunkforwarder]: Scheduling refresh of Service[splunk]
Debug: /Stage[main]/Splunk::Platform::Posix/Exec[enable_splunkforwarder]: The container Class[Splunk::Platform::Posix] will propagate my refresh event
Debug: Class[Splunk::Platform::Posix]: The container Stage[main] will propagate my refresh event
Debug: Executing: '/usr/bin/systemctl is-active splunk'
Debug: Executing: '/usr/bin/systemctl is-enabled splunk'
Debug: Executing: '/usr/bin/systemctl unmask splunk'
Debug: Executing: '/usr/bin/systemctl start splunk'
Debug: Runing journalctl command to get logs for systemd start failure: journalctl -n 50 --since '5 minutes ago' -u splunk --no-pager
Debug: Executing: 'journalctl -n 50 --since '5 minutes ago' -u splunk --no-pager'
Debug: Executing: '/usr/bin/systemctl is-active splunk'
Error: Systemd start for splunk failed!
journalctl log for splunk:
-- No entries --

   	Error: /Stage[main]/Splunk::Virtual/Service[splunk]/ensure: change from stopped to running failed: Systemd start for splunk failed!
   	journalctl log for splunk:
   	-- No entries --
   	
   	Debug: /Stage[main]/Splunk::Virtual/Service[splunk]: Skipping restart; service is not running
   	Notice: /Stage[main]/Splunk::Virtual/Service[splunk]: Triggered 'refresh' from 6 events
   	Debug: /Stage[main]/Splunk::Virtual/Service[splunk]: The container Class[Splunk::Virtual] will propagate my refresh event
   	Debug: Class[Splunk::Virtual]: Resource is being skipped, unscheduling all events
   	Info: Class[Splunk::Virtual]: Unscheduling all events on Class[Splunk::Virtual]
   	Notice: /Stage[main]/Pumasplunk::Config/Ini_setting[/opt/splunkforwarder/etc/system/local/inputs.conf [default] host]/value: value changed '[redacted sensitive information]' to '[redacted sensitive information]'
   	Debug: /Stage[main]/Pumasplunk::Config/Ini_setting[/opt/splunkforwarder/etc/system/local/inputs.conf [default] host]: The container Class[Pumasplunk::Config] will propagate my refresh event
   	Notice: /Stage[main]/Pumasplunk::Config/Ini_setting[/opt/splunkforwarder/etc/system/local/inputs.conf [monitor:///var/log/audit/audit.log] disabled]/ensure: created
   	Debug: /Stage[main]/Pumasplunk::Config/Ini_setting[/opt/splunkforwarder/etc/system/local/inputs.conf [monitor:///var/log/audit/audit.log] disabled]: The container Class[Pumasplunk::Config] will propagate my refresh event
   	Notice: /Stage[main]/Pumasplunk::Config/Ini_setting[/opt/splunkforwarder/etc/system/local/inputs.conf [monitor:///var/log/audit/audit.log] followTail]/ensure: created
   	Debug: /Stage[main]/Pumasplunk::Config/Ini_setting[/opt/splunkforwarder/etc/system/local/inputs.conf [monitor:///var/log/audit/audit.log] followTail]: The container Class[Pumasplunk::Config] will propagate my refresh event
   	Notice: /Stage[main]/Pumasplunk::Config/Ini_setting[/opt/splunkforwarder/etc/system/local/inputs.conf [monitor:///var/log/audit/audit.log] index]/ensure: created
   	Debug: /Stage[main]/Pumasplunk::Config/Ini_setting[/opt/splunkforwarder/etc/system/local/inputs.conf [monitor:///var/log/audit/audit.log] index]: The container Class[Pumasplunk::Config] will propagate my refresh event
   	Notice: /Stage[main]/Pumasplunk::Config/Ini_setting[/opt/splunkforwarder/etc/system/local/inputs.conf [monitor:///var/log/audit/audit.log] sourcetype]/ensure: created
   	Debug: /Stage[main]/Pumasplunk::Config/Ini_setting[/opt/splunkforwarder/etc/system/local/inputs.conf [monitor:///var/log/audit/audit.log] sourcetype]: The container Class[Pumasplunk::Config] will propagate my refresh event
   	Notice: /Stage[main]/Pumasplunk::Config/Ini_setting[/opt/splunkforwarder/etc/system/local/inputs.conf [monitor:///var/log/audit/audit.log] crcSalt]/ensure: created
   	Debug: /Stage[main]/Pumasplunk::Config/Ini_setting[/opt/splunkforwarder/etc/system/local/inputs.conf [monitor:///var/log/audit/audit.log] crcSalt]: The container Class[Pumasplunk::Config] will propagate my refresh event
   	Notice: /Stage[main]/Pumasplunk::Config/Ini_setting[/opt/splunkforwarder/etc/system/local/inputs.conf [monitor:///var/log/auth.log] disabled]/ensure: created
   	Debug: /Stage[main]/Pumasplunk::Config/Ini_setting[/opt/splunkforwarder/etc/system/local/inputs.conf [monitor:///var/log/auth.log] disabled]: The container Class[Pumasplunk::Config] will propagate my refresh event
   	Notice: /Stage[main]/Pumasplunk::Config/Ini_setting[/opt/splunkforwarder/etc/system/local/inputs.conf [monitor:///var/log/auth.log] followTail]/ensure: created
   	Debug: /Stage[main]/Pumasplunk::Config/Ini_setting[/opt/splunkforwarder/etc/system/local/inputs.conf [monitor:///var/log/auth.log] followTail]: The container Class[Pumasplunk::Config] will propagate my refresh event
   	Notice: /Stage[main]/Pumasplunk::Config/Ini_setting[/opt/splunkforwarder/etc/system/local/inputs.conf [monitor:///var/log/auth.log] index]/ensure: created
   	Debug: /Stage[main]/Pumasplunk::Config/Ini_setting[/opt/splunkforwarder/etc/system/local/inputs.conf [monitor:///var/log/auth.log] index]: The container Class[Pumasplunk::Config] will propagate my refresh event
   	Notice: /Stage[main]/Pumasplunk::Config/Ini_setting[/opt/splunkforwarder/etc/system/local/inputs.conf [monitor:///var/log/auth.log] sourcetype]/ensure: created
   	Debug: /Stage[main]/Pumasplunk::Config/Ini_setting[/opt/splunkforwarder/etc/system/local/inputs.conf [monitor:///var/log/auth.log] sourcetype]: The container Class[Pumasplunk::Config] will propagate my refresh event
   	Notice: /Stage[main]/Pumasplunk::Config/Ini_setting[/opt/splunkforwarder/etc/system/local/inputs.conf [monitor:///var/log/auth.log] crcSalt]/ensure: created
   	Debug: /Stage[main]/Pumasplunk::Config/Ini_setting[/opt/splunkforwarder/etc/system/local/inputs.conf [monitor:///var/log/auth.log] crcSalt]: The container Class[Pumasplunk::Config] will propagate my refresh event
   	Notice: /Stage[main]/Pumasplunk::Config/Ini_setting[/opt/splunkforwarder/etc/system/local/inputs.conf [monitor:///var/log/messages] disabled]/ensure: created
   	Debug: /Stage[main]/Pumasplunk::Config/Ini_setting[/opt/splunkforwarder/etc/system/local/inputs.conf [monitor:///var/log/messages] disabled]: The container Class[Pumasplunk::Config] will propagate my refresh event
   	Notice: /Stage[main]/Pumasplunk::Config/Ini_setting[/opt/splunkforwarder/etc/system/local/inputs.conf [monitor:///var/log/messages] followTail]/ensure: created
   	Debug: /Stage[main]/Pumasplunk::Config/Ini_setting[/opt/splunkforwarder/etc/system/local/inputs.conf [monitor:///var/log/messages] followTail]: The container Class[Pumasplunk::Config] will propagate my refresh event
   	Notice: /Stage[main]/Pumasplunk::Config/Ini_setting[/opt/splunkforwarder/etc/system/local/inputs.conf [monitor:///var/log/messages] index]/ensure: created
   	Debug: /Stage[main]/Pumasplunk::Config/Ini_setting[/opt/splunkforwarder/etc/system/local/inputs.conf [monitor:///var/log/messages] index]: The container Class[Pumasplunk::Config] will propagate my refresh event
   	Notice: /Stage[main]/Pumasplunk::Config/Ini_setting[/opt/splunkforwarder/etc/system/local/inputs.conf [monitor:///var/log/messages] sourcetype]/ensure: created
   	Debug: /Stage[main]/Pumasplunk::Config/Ini_setting[/opt/splunkforwarder/etc/system/local/inputs.conf [monitor:///var/log/messages] sourcetype]: The container Class[Pumasplunk::Config] will propagate my refresh event
   	Notice: /Stage[main]/Pumasplunk::Config/Ini_setting[/opt/splunkforwarder/etc/system/local/inputs.conf [monitor:///var/log/messages] crcSalt]/ensure: created
   	Debug: /Stage[main]/Pumasplunk::Config/Ini_setting[/opt/splunkforwarder/etc/system/local/inputs.conf [monitor:///var/log/messages] crcSalt]: The container Class[Pumasplunk::Config] will propagate my refresh event
   	Notice: /Stage[main]/Pumasplunk::Config/Ini_setting[/opt/splunkforwarder/etc/system/local/outputs.conf [tcpout] defaultGroup]/ensure: created
   	Debug: /Stage[main]/Pumasplunk::Config/Ini_setting[/opt/splunkforwarder/etc/system/local/outputs.conf [tcpout] defaultGroup]: The container Class[Pumasplunk::Config] will propagate my refresh event
   	Notice: /Stage[main]/Pumasplunk::Config/Ini_setting[/opt/splunkforwarder/etc/system/local/outputs.conf [tcpout] disabled]/ensure: created
   	Debug: /Stage[main]/Pumasplunk::Config/Ini_setting[/opt/splunkforwarder/etc/system/local/outputs.conf [tcpout] disabled]: The container Class[Pumasplunk::Config] will propagate my refresh event
   	Notice: /Stage[main]/Pumasplunk::Config/Ini_setting[/opt/splunkforwarder/etc/system/local/outputs.conf [tcpout:Production] server]/ensure: created
   	Debug: /Stage[main]/Pumasplunk::Config/Ini_setting[/opt/splunkforwarder/etc/system/local/outputs.conf [tcpout:Production] server]: The container Class[Pumasplunk::Config] will propagate my refresh event
   	Debug: Class[Pumasplunk::Config]: The container Stage[main] will propagate my refresh event
   	Debug: Class[Pumasplunk::Config]: The container Class[Pumasplunk] will propagate my refresh event
   	Debug: Class[Pumasplunk]: The container Stage[main] will propagate my refresh event
   	Debug: Stage[main]: Resource is being skipped, unscheduling all events
   	Info: Stage[main]: Unscheduling all events on Stage[main]
   	Debug: Finishing transaction 29608740
   	Debug: Storing state
   	Info: Creating state file /opt/puppetlabs/puppet/cache/state/state.yaml
   	Debug: Stored state in 0.01 seconds
   	Notice: Applied catalog in 139.19 seconds
   	Debug: Applying settings catalog for sections reporting, metrics
   	Debug: Finishing transaction 27656040
   	Debug: Received report to process from centos-7-x64.core.dir.telstra.com
   	Debug: Evicting cache entry for environment 'production'
   	Debug: Caching environment 'production' (ttl = 0 sec)
   	Debug: Processing report from centos-7-x64.core.dir.telstra.com with processor Puppet::Reports::Store
   
 # /home/vagrant/.rvm/gems/ruby-2.2.10/gems/beaker-3.22.0/lib/beaker/host.rb:375:in `exec'
 # /home/vagrant/.rvm/gems/ruby-2.2.10/gems/beaker-3.22.0/lib/beaker/dsl/helpers/host_helpers.rb:83:in `block in on'
 # /home/vagrant/.rvm/gems/ruby-2.2.10/gems/beaker-3.22.0/lib/beaker/shared/host_manager.rb:127:in `run_block_on'
 # /home/vagrant/.rvm/gems/ruby-2.2.10/gems/beaker-3.22.0/lib/beaker/dsl/patterns.rb:37:in `block_on'
 # /home/vagrant/.rvm/gems/ruby-2.2.10/gems/beaker-3.22.0/lib/beaker/dsl/helpers/host_helpers.rb:63:in `on'
 # /home/vagrant/.rvm/gems/ruby-2.2.10/gems/beaker-puppet-0.17.1/lib/beaker-puppet/helpers/puppet_helpers.rb:521:in `block in apply_manifest_on'
 # /home/vagrant/.rvm/gems/ruby-2.2.10/gems/beaker-3.22.0/lib/beaker/shared/host_manager.rb:127:in `run_block_on'
 # /home/vagrant/.rvm/gems/ruby-2.2.10/gems/beaker-3.22.0/lib/beaker/dsl/patterns.rb:37:in `block_on'
 # /home/vagrant/.rvm/gems/ruby-2.2.10/gems/beaker-puppet-0.17.1/lib/beaker-puppet/helpers/puppet_helpers.rb:450:in `apply_manifest_on'
 # /home/vagrant/.rvm/gems/ruby-2.2.10/gems/beaker-puppet-0.17.1/lib/beaker-puppet/helpers/puppet_helpers.rb:528:in `apply_manifest'
 # ./spec/acceptance/pumasplunk_spec.rb:23:in `block (3 levels) in <top (required)>'
 # /home/vagrant/.rvm/gems/ruby-2.2.10/gems/rspec-retry-0.6.1/lib/rspec/retry.rb:123:in `block in run'
 # /home/vagrant/.rvm/gems/ruby-2.2.10/gems/rspec-retry-0.6.1/lib/rspec/retry.rb:110:in `loop'
 # /home/vagrant/.rvm/gems/ruby-2.2.10/gems/rspec-retry-0.6.1/lib/rspec/retry.rb:110:in `run'
 # /home/vagrant/.rvm/gems/ruby-2.2.10/gems/rspec-retry-0.6.1/lib/rspec_ext/rspec_ext.rb:12:in `run_with_retry'
 # /home/vagrant/.rvm/gems/ruby-2.2.10/gems/rspec-retry-0.6.1/lib/rspec/retry.rb:37:in `block (2 levels) in setup'
  1. pumasplunk install with default parameters is idempotent
    Failure/Error: result = apply_manifest(pp, catch_changes: true)
    Beaker::Host::CommandFailure:
    Host 'centos-7-x64' exited with 6 running:
    puppet apply --verbose --detailed-exitcodes /tmp/apply_manifest.pp.v7sESR
    Last 200 lines of output were:
    2019-04-11 13:29:30.475607 WARN puppetlabs.facter - locale environment variables were bad; continuing with LANG=C LC_ALL=C
    Info: Loading facts
    Info: Loading facts
    Warning: /etc/puppetlabs/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5
    (in /etc/puppetlabs/puppet/hiera.yaml)
    Notice: Compiled catalog for centos-7-x64.core.dir.telstra.com in environment production in 0.77 seconds
    Info: Applying configuration version '1554989374'
    Notice: /Stage[main]/Splunk::Forwarder/Splunkforwarder_input[default_host]/value: value changed 'myfqdn' to 'centos-7-x64.core.dir.telstra.com'
    Info: /Stage[main]/Splunk::Forwarder/Splunkforwarder_input[default_host]: Scheduling refresh of Service[splunk]
    Notice: /Stage[main]/Splunk::Platform::Posix/Exec[enable_splunkforwarder]/returns: Unable to create symlink='/etc/systemd/system/multi-user.target.wants/SplunkForwarder.service'
    Notice: /Stage[main]/Splunk::Platform::Posix/Exec[enable_splunkforwarder]/returns: : File exists
    Error: '/opt/splunkforwarder/bin/splunk enable boot-start -user splunk' returned 8 instead of one of [0]
    Error: /Stage[main]/Splunk::Platform::Posix/Exec[enable_splunkforwarder]/returns: change from notrun to 0 failed: '/opt/splunkforwarder/bin/splunk enable boot-start -user splunk' returned 8 instead of one of [0]
    Notice: /Stage[main]/Splunk::Virtual/Service[splunk]: Dependency Exec[enable_splunkforwarder] has failures: true
    Info: /Stage[main]/Splunk::Virtual/Service[splunk]: Unscheduling all events on Service[splunk]
    Warning: /Stage[main]/Splunk::Virtual/Service[splunk]: Skipping because of failed dependencies
    Notice: /Stage[main]/Pumasplunk::Config/Ini_setting[/opt/splunkforwarder/etc/system/local/inputs.conf [default] host]/value: value changed '[redacted sensitive information]' to '[redacted sensitive information]'
    Info: Stage[main]: Unscheduling all events on Stage[main]
    Notice: Applied catalog in 0.27 seconds

    /home/vagrant/.rvm/gems/ruby-2.2.10/gems/beaker-3.22.0/lib/beaker/host.rb:375:in `exec'

    /home/vagrant/.rvm/gems/ruby-2.2.10/gems/beaker-3.22.0/lib/beaker/dsl/helpers/host_helpers.rb:83:in `block in on'

    /home/vagrant/.rvm/gems/ruby-2.2.10/gems/beaker-3.22.0/lib/beaker/shared/host_manager.rb:127:in `run_block_on'

    /home/vagrant/.rvm/gems/ruby-2.2.10/gems/beaker-3.22.0/lib/beaker/dsl/patterns.rb:37:in `block_on'

    /home/vagrant/.rvm/gems/ruby-2.2.10/gems/beaker-3.22.0/lib/beaker/dsl/helpers/host_helpers.rb:63:in `on'

    /home/vagrant/.rvm/gems/ruby-2.2.10/gems/beaker-puppet-0.17.1/lib/beaker-puppet/helpers/puppet_helpers.rb:521:in `block in apply_manifest_on'

    /home/vagrant/.rvm/gems/ruby-2.2.10/gems/beaker-3.22.0/lib/beaker/shared/host_manager.rb:127:in `run_block_on'

    /home/vagrant/.rvm/gems/ruby-2.2.10/gems/beaker-3.22.0/lib/beaker/dsl/patterns.rb:37:in `block_on'

    /home/vagrant/.rvm/gems/ruby-2.2.10/gems/beaker-puppet-0.17.1/lib/beaker-puppet/helpers/puppet_helpers.rb:450:in `apply_manifest_on'

    /home/vagrant/.rvm/gems/ruby-2.2.10/gems/beaker-puppet-0.17.1/lib/beaker-puppet/helpers/puppet_helpers.rb:528:in `apply_manifest'

    ./spec/acceptance/pumasplunk_spec.rb:27:in `block (3 levels) in <top (required)>'

    /home/vagrant/.rvm/gems/ruby-2.2.10/gems/rspec-retry-0.6.1/lib/rspec/retry.rb:123:in `block in run'

    /home/vagrant/.rvm/gems/ruby-2.2.10/gems/rspec-retry-0.6.1/lib/rspec/retry.rb:110:in `loop'

    /home/vagrant/.rvm/gems/ruby-2.2.10/gems/rspec-retry-0.6.1/lib/rspec/retry.rb:110:in `run'

    /home/vagrant/.rvm/gems/ruby-2.2.10/gems/rspec-retry-0.6.1/lib/rspec_ext/rspec_ext.rb:12:in `run_with_retry'

    /home/vagrant/.rvm/gems/ruby-2.2.10/gems/rspec-retry-0.6.1/lib/rspec/retry.rb:37:in `block (2 levels) in setup'

  2. pumasplunk install with default parameters Service "SplunkForwarder" should be running
    Failure/Error: it { is_expected.to be_running }
    expected Service "SplunkForwarder" to be running

    ./spec/acceptance/pumasplunk_spec.rb:41:in `block (4 levels) in <top (required)>'

    /home/vagrant/.rvm/gems/ruby-2.2.10/gems/rspec-retry-0.6.1/lib/rspec/retry.rb:123:in `block in run'

    /home/vagrant/.rvm/gems/ruby-2.2.10/gems/rspec-retry-0.6.1/lib/rspec/retry.rb:110:in `loop'

    /home/vagrant/.rvm/gems/ruby-2.2.10/gems/rspec-retry-0.6.1/lib/rspec/retry.rb:110:in `run'

    /home/vagrant/.rvm/gems/ruby-2.2.10/gems/rspec-retry-0.6.1/lib/rspec_ext/rspec_ext.rb:12:in `run_with_retry'

    /home/vagrant/.rvm/gems/ruby-2.2.10/gems/rspec-retry-0.6.1/lib/rspec/retry.rb:37:in `block (2 levels) in setup'

Finished in 3 minutes 45 seconds (files took 11.63 seconds to load)
6 examples, 3 failures

Failed examples:

rspec ./spec/acceptance/pumasplunk_spec.rb:22 # pumasplunk install with default parameters applies with no errors
rspec ./spec/acceptance/pumasplunk_spec.rb:26 # pumasplunk install with default parameters is idempotent
rspec ./spec/acceptance/pumasplunk_spec.rb:41 # pumasplunk install with default parameters Service "SplunkForwarder" should be running

Since I am testing the Puppet code inside a local Docker container, the following is what I can see inside the container once the splunkforwarder package installation has completed:

[root@centos-7-x64 /]# ls -l /etc/systemd/system/SplunkForwarder.service;cat /etc/systemd/system/SplunkForwarder.service;ls -l /etc/systemd/system/multi-user.target.wants/;ps -ef |grep systemctl
-rwx------. 1 root root 793 Apr 11 13:29 /etc/systemd/system/SplunkForwarder.service
#This unit file replaces the traditional start-up script for systemd
#configurations, and is used when enabling boot-start for Splunk on
#systemd-based Linux distributions.

[Unit]
Description=Systemd service file for Splunk, generated by 'splunk enable boot-start'
After=network.target

[Service]
Type=simple
Restart=always
ExecStart=/opt/splunkforwarder/bin/splunk _internal_launch_under_systemd
LimitNOFILE=65536
SuccessExitStatus=51 52
RestartPreventExitStatus=51
RestartForceExitStatus=52
User=splunk
Delegate=true
MemoryLimit=100G
CPUShares=1024
PermissionsStartOnly=true
ExecStartPost=/bin/bash -c "chown -R splunk:splunk /sys/fs/cgroup/cpu/system.slice/%n"
ExecStartPost=/bin/bash -c "chown -R splunk:splunk /sys/fs/cgroup/memory/system.slice/%n"

[Install]
WantedBy=multi-user.target
total 0
lrwxrwxrwx. 1 root root 37 Mar 5 04:19 crond.service -> /usr/lib/systemd/system/crond.service
lrwxrwxrwx. 1 root root 46 Mar 5 04:19 rhel-configure.service -> /usr/lib/systemd/system/rhel-configure.service
lrwxrwxrwx. 1 root root 43 Apr 11 13:29 SplunkForwarder.service -> /etc/systemd/system/SplunkForwarder.service
lrwxrwxrwx. 1 root root 36 Apr 3 01:49 sshd.service -> /usr/lib/systemd/system/sshd.service
root 2517 1905 0 13:29 ? 00:00:00 grep --color=auto systemctl

Any idea why "systemctl start splunk" is failing?

@alexjfisher
Member

@zhasaan A large refactoring to support Splunk 7.2 has just been merged.

It's not in a release yet, but if you want to test the version from the master branch, hopefully this will work better for you.
