Class: Sidekiq::ProcessSet
- Inherits: Object
- Includes: Enumerable
- Defined in: lib/sidekiq/api.rb
Overview
Enumerates the set of Sidekiq processes which are actively working right now. Each process sends a heartbeat to Redis every 5 seconds, so this set should be relatively accurate, barring network partitions.
Yields a Sidekiq::Process.
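A quick sketch of typical usage; the 'hostname', 'pid', and 'busy' keys are assumptions about the contents of each process's heartbeat info hash:

  require 'sidekiq/api'

  Sidekiq::ProcessSet.new.each do |process|
    # Each yielded Sidekiq::Process exposes its heartbeat data via [].
    puts "#{process['hostname']}:#{process['pid']} busy=#{process['busy']}"
  end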
Class Method Summary
- .cleanup ⇒ Object
Cleans up dead processes recorded in Redis.
Instance Method Summary
- #each ⇒ Object
- #initialize(clean_plz = true) ⇒ ProcessSet (constructor)
A new instance of ProcessSet.
- #size ⇒ Object
This method is not guaranteed accurate since it does not prune the set based on current heartbeat.
Constructor Details
#initialize(clean_plz = true) ⇒ ProcessSet
Returns a new instance of ProcessSet.
# File 'lib/sidekiq/api.rb', line 628

def initialize(clean_plz=true)
  self.class.cleanup if clean_plz
end
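A small sketch of the clean_plz flag: passing false skips the cleanup pass, avoiding the extra Redis round trips when stale entries are acceptable.

  require 'sidekiq/api'

  ps = Sidekiq::ProcessSet.new(false)  # skip ProcessSet.cleanup on construction
  puts ps.size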
Class Method Details
.cleanup ⇒ Object
Cleans up dead processes recorded in Redis. Returns the number of processes cleaned.
# File 'lib/sidekiq/api.rb', line 634

def self.cleanup
  count = 0
  Sidekiq.redis do |conn|
    procs = conn.smembers('processes').sort
    heartbeats = conn.pipelined do
      procs.each do |key|
        conn.hget(key, 'info')
      end
    end

    # the hash named key has an expiry of 60 seconds.
    # if it's not found, that means the process has not reported
    # in to Redis and probably died.
    to_prune = []
    heartbeats.each_with_index do |beat, i|
      to_prune << procs[i] if beat.nil?
    end
    count = conn.srem('processes', to_prune) unless to_prune.empty?
  end
  count
end
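A minimal sketch of running the cleanup pass by hand; per the code above, the return value is the number of stale entries removed from the 'processes' set.

  require 'sidekiq/api'

  pruned = Sidekiq::ProcessSet.cleanup
  puts "Pruned #{pruned} dead process entries"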
Instance Method Details
#each ⇒ Object
# File 'lib/sidekiq/api.rb', line 656

def each
  procs = Sidekiq.redis { |conn| conn.smembers('processes') }.sort

  Sidekiq.redis do |conn|
    # We're making a tradeoff here between consuming more memory instead of
    # making more roundtrips to Redis, but if you have hundreds or thousands of workers,
    # you'll be happier this way
    result = conn.pipelined do
      procs.each do |key|
        conn.hmget(key, 'info', 'busy', 'beat')
      end
    end

    result.each do |info, busy, at_s|
      hash = Sidekiq.load_json(info)
      yield Process.new(hash.merge('busy' => busy.to_i, 'beat' => at_s.to_f))
    end
  end

  nil
end
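Because ProcessSet includes Enumerable, the yielded Sidekiq::Process objects can be filtered and aggregated directly. A sketch, assuming the 'hostname' and 'busy' heartbeat keys shown above:

  require 'socket'
  require 'sidekiq/api'

  ps = Sidekiq::ProcessSet.new
  total_busy = ps.sum { |process| process['busy'] }
  local      = ps.select { |process| process['hostname'] == Socket.gethostname }
  puts "#{total_busy} busy threads cluster-wide, #{local.size} process(es) on this host"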
#size ⇒ Object
This method is not guaranteed accurate since it does not prune the set based on current heartbeat. #each does that and ensures the set only contains Sidekiq processes which have sent a heartbeat within the last 60 seconds.
# File 'lib/sidekiq/api.rb', line 682

def size
  Sidekiq.redis { |conn| conn.scard('processes') }
end
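Sketch: since #size is a plain SCARD, constructing the set with the default clean_plz (which runs cleanup first) gives a fresher count than constructing with false:

  require 'sidekiq/api'

  puts Sidekiq::ProcessSet.new.size        # cleanup runs first, count is fresher
  puts Sidekiq::ProcessSet.new(false).size # raw SCARD, may include dead entries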