
Change pidfile handling, always add index to pidfile name #116

Merged 1 commit into seuros:master on Oct 27, 2015

Conversation

@w1mvy (Contributor) commented Oct 27, 2015

Fixes #44.

Changed pidfile handling to always add the index to the pidfile name.
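As a minimal sketch of the effect (the base pidfile path here is illustrative, not taken from the PR):

# Every process, including the first, now gets an indexed pidfile name.
base_pid = "tmp/pids/sidekiq.pid"   # hypothetical value of :sidekiq_pid

3.times do |idx|
  puts base_pid.gsub(/\.pid$/, "-#{idx}.pid")
end
# => tmp/pids/sidekiq-0.pid
#    tmp/pids/sidekiq-1.pid
#    tmp/pids/sidekiq-2.pid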

seuros added a commit that referenced this pull request Oct 27, 2015
Change pidfile handling, always add index to pidfile name
@seuros merged commit 71a4278 into seuros:master on Oct 27, 2015
In lib/capistrano/tasks/capistrano2.rb:

-      else
-        pid_file = fetch(:sidekiq_pid).gsub(/\.pid$/, "-#{idx}.pid")
-      end
+      pid_file = fetch(:sidekiq_pid).gsub(/\.pid$/, "-#{idx}.pid")


Um, guys?

This overrides our pidfile setting in sidekiq.yml, and that's been causing some rather serious problems.

@seuros (Owner) commented:

what serious problems ?


The problems all stem from the fact that capistrano-sidekiq is now incompatible with the command line tool ‘sidekiq’ with the -C option for a config file.

For a week we’ve been chasing unexpected multiple instances of sidekiq running against the same queues, unable to figure out where the errant processes have been coming from. We use multiple tools to start and stop sidekiq, and we really, really need all of the tools we use to do the same thing, not least because we have a hard ceiling on sidekiq concurrency at 12, above which our solr server will stop accepting connections and bring down the whole app.

I do understand the benefit of the naming scheme for those running multiple sidekiq instances on a single machine, but for anyone else it is completely unexpected. And in our case at least, it breaks system management, which breaks the app.

We’ve reverted to 0.5.3.
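A minimal sketch of the mismatch being described, using hypothetical paths:

# sidekiq launched as `sidekiq -C config/sidekiq.yml` writes its pid to the
# pidfile named in that YAML file (hypothetical path):
yaml_pidfile = "tmp/pids/sidekiq.pid"

# capistrano-sidekiq now always derives an indexed name from :sidekiq_pid:
cap_pidfile = "tmp/pids/sidekiq.pid".gsub(/\.pid$/, "-0.pid")
# => "tmp/pids/sidekiq-0.pid"

# Tooling that stops sidekiq by reading yaml_pidfile never sees the process
# tracked under cap_pidfile (and vice versa), which is how duplicate workers
# can end up running against the same queues.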

On Dec 4, 2015, at 2:27 PM, Abdelkader Boudih notifications@github.com wrote:

In lib/capistrano/tasks/capistrano2.rb:

@@ -34,11 +34,7 @@
   def for_each_process(sidekiq_role, &block)
     sidekiq_processes = fetch(:"#{ sidekiq_role }_processes") rescue 1
     sidekiq_processes.times do |idx|
-      if idx.zero? && sidekiq_processes <= 1
-        pid_file = fetch(:sidekiq_pid)
-      else
-        pid_file = fetch(:sidekiq_pid).gsub(/\.pid$/, "-#{idx}.pid")
-      end
+      pid_file = fetch(:sidekiq_pid).gsub(/\.pid$/, "-#{idx}.pid")

what serious problems ?




Same here. I think this should be an option. Instead of modifying the passed pidfile, which is quite unexpected, as @edslocomb mentioned, why not just accept a pid of the form my-file-%d.pid and sprintf on it? It would work just as well while staying explicit and avoiding breaking things. If this seems fine to you, I will send a PR, so let me know what you think.
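A rough sketch of that suggestion, assuming :sidekiq_pid were treated as a format string (names are hypothetical):

# Hypothetical: the user supplies the template, the gem only fills in the index.
pid_template = "tmp/pids/my-file-%d.pid"   # e.g. set :sidekiq_pid, "tmp/pids/my-file-%d.pid"

3.times do |idx|
  puts format(pid_template, idx)
end
# => tmp/pids/my-file-0.pid
#    tmp/pids/my-file-1.pid
#    tmp/pids/my-file-2.pid

That would keep the final name exactly what was configured, so external tools reading the same pidfile path stay in sync.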

metavida added a commit to haikulearning/capistrano-sidekiq that referenced this pull request Jan 26, 2016
Also update the default value to reflect the change made in seuros#116
@ljachymczyk

I have a problem with these changes too. I have a sidekiq.yml file that specifies the pidfile location, and the corresponding sidekiq_config Capistrano variable is set, and still the pidfile gets '-0' appended.

Specifying the pidfile directly in the sidekiq_pid Capistrano variable also does not change this odd behaviour.
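For context, a sketch of the kind of setup being described (paths are hypothetical):

# Hypothetical Capistrano configuration along the lines described above:
set :sidekiq_config, "config/sidekiq.yml"   # sidekiq.yml itself sets pidfile:
set :sidekiq_pid,    "tmp/pids/sidekiq.pid"

# With this change the task still rewrites the name to an indexed one:
# "tmp/pids/sidekiq.pid" becomes "tmp/pids/sidekiq-0.pid",
# so neither the sidekiq.yml pidfile nor :sidekiq_pid is used as given.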

Labels: none
Projects: none
5 participants