Class: GitLab::Exporter::SidekiqProber
- Inherits: Object
- Defined in: lib/gitlab_exporter/sidekiq.rb
Overview
A prober for Sidekiq queues. It takes the URL of the Redis instance that Sidekiq is connected to.
Constant Summary
- PROBE_JOBS_LIMIT = 1_000
  The maximum depth (from the head) of each queue to probe. Probing the entirety of a very large queue takes longer and risks timing out, yet it is precisely when a queue is very large that we most need reliable metrics. This trades completeness for predictability by taking only a limited number of items from the head of the queue.
- POOL_SIZE = 3
- POOL_TIMEOUT = 90
  This timeout is set higher than the Prometheus scrape interval so that the connection is kept alive rather than re-initialized on every scrape.
- SIDEKIQ_REDIS_LOCK = Mutex.new
  Lock for Sidekiq.redis, which we need to modify but which is not concurrency safe.
Class Method Summary
Instance Method Summary
- #initialize(metrics: PrometheusMetrics.new, logger: nil, **opts) ⇒ SidekiqProber (constructor)
  A new instance of SidekiqProber.
- #probe_dead ⇒ Object
- #probe_future_sets ⇒ Object
- #probe_jobs ⇒ Object
- #probe_jobs_limit ⇒ Object
  Count worker classes present in Sidekiq queues.
- #probe_queues ⇒ Object
- #probe_retries ⇒ Object
- #probe_stats ⇒ Object
- #probe_workers ⇒ Object
- #write_to(target) ⇒ Object
Constructor Details
#initialize(metrics: PrometheusMetrics.new, logger: nil, **opts) ⇒ SidekiqProber
Returns a new instance of SidekiqProber.
# File 'lib/gitlab_exporter/sidekiq.rb', line 39

def initialize(metrics: PrometheusMetrics.new, logger: nil, **opts)
  @opts = opts
  @metrics = metrics
  @logger = logger
end
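Because the constructor simply stores its collaborators, the prober is easy to exercise with test doubles. A minimal sketch of the same dependency-injection pattern, using a hypothetical `FakeMetrics` stand-in (not part of the library) instead of the real `PrometheusMetrics`:

```ruby
# FakeMetrics is a hypothetical stand-in for PrometheusMetrics: it just
# records every (name, value, labels) triple passed to #add.
class FakeMetrics
  attr_reader :calls

  def initialize
    @calls = []
  end

  def add(name, value, **labels)
    @calls << [name, value, labels]
  end
end

# A trimmed-down prober showing the same keyword-argument constructor.
class MiniProber
  def initialize(metrics: FakeMetrics.new, logger: nil, **opts)
    @opts = opts
    @metrics = metrics
    @logger = logger
  end

  def probe_constant
    @metrics.add("up", 1)
    self # probes return self so calls can be chained
  end
end

metrics = FakeMetrics.new
MiniProber.new(metrics: metrics).probe_constant.probe_constant
```

Injecting the metrics collector this way is what lets `write_to` later flush everything in one place.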
Class Method Details
.connection_pool ⇒ Object
# File 'lib/gitlab_exporter/sidekiq.rb', line 31

def self.connection_pool
  @@connection_pool ||= Hash.new do |h, connection_hash| # rubocop:disable Style/ClassVars
    config = connection_hash.merge(pool_timeout: POOL_TIMEOUT, size: POOL_SIZE)

    h[connection_hash] = Sidekiq::RedisConnection.create(config)
  end
end
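`connection_pool` relies on `Hash.new` with a default block: the block runs only on a cache miss and stores the new pool under its key, so each distinct connection hash gets exactly one pool. A self-contained sketch of that memoization idiom, with a counter standing in for `Sidekiq::RedisConnection.create`:

```ruby
created = 0

# The default block runs only when a key is missing; it stores the new
# "pool" under that key, so later lookups with an equal key reuse it.
pools = Hash.new do |h, connection_hash|
  created += 1
  h[connection_hash] = "pool-for-#{connection_hash[:url]}"
end

a = pools[{ url: "redis://localhost:6379/0" }]
b = pools[{ url: "redis://localhost:6379/0" }] # cache hit, no new pool
c = pools[{ url: "redis://localhost:6379/1" }] # different key, new pool
```

Ruby hashes compare by value, so two separately built option hashes with the same contents hit the same cache entry.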
Instance Method Details
#probe_dead ⇒ Object
# File 'lib/gitlab_exporter/sidekiq.rb', line 166

def probe_dead
  puts "[DEPRECATED] probe_dead is now considered obsolete and will be removed in future major versions," \
       " please use probe_stats instead"

  with_sidekiq do
    @metrics.add("sidekiq_dead_jobs", Sidekiq::Stats.new.dead_size)
  end

  self
end
#probe_future_sets ⇒ Object
# File 'lib/gitlab_exporter/sidekiq.rb', line 83

def probe_future_sets
  now = Time.now.to_f
  with_sidekiq do
    Sidekiq.redis do |conn|
      Sidekiq::Scheduled::SETS.each do |set|
        # Default to 0; if all jobs are due in the future, there is no "negative" delay.
        delay = 0

        _job, timestamp = conn.zrangebyscore(set, "-inf", now.to_s, limit: [0, 1], withscores: true).first
        delay = now - timestamp if timestamp

        @metrics.add("sidekiq_#{set}_set_processing_delay_seconds", delay)

        # zcount is O(log(N)) (prob. binary search), so is still quick even with large sets
        @metrics.add("sidekiq_#{set}_set_backlog_count", conn.zcount(set, "-inf", now.to_s))
      end
    end
  end
end
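The delay metric is the age of the oldest overdue entry in the sorted set, whose score is the time the job was scheduled to run. A sketch of the same computation over an in-memory array of `[payload, score]` pairs in place of the real `ZRANGEBYSCORE`/`ZCOUNT` calls (the data and the `backlog` name are illustrative):

```ruby
now = 1_000.0

# Simulated sorted set: [payload, score] pairs ordered by score, where
# the score is the timestamp at which the job becomes due.
scheduled = [["job-a", 940.0], ["job-b", 990.0], ["job-c", 1_050.0]]

# Like ZRANGEBYSCORE set -inf now LIMIT 0 1 WITHSCORES:
# the oldest entry that is already due, with its score.
_job, timestamp = scheduled.select { |_, score| score <= now }.first

# If nothing is due yet, report zero delay rather than a negative number.
delay = timestamp ? now - timestamp : 0

# Like ZCOUNT set -inf now: how many entries are already due.
backlog = scheduled.count { |_, score| score <= now }
```

Fetching only the head entry keeps the probe O(log N) regardless of how far behind the scheduler is.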
#probe_jobs ⇒ Object
# File 'lib/gitlab_exporter/sidekiq.rb', line 76

def probe_jobs
  puts "[REMOVED] probe_jobs is now considered obsolete and does not emit any metrics," \
       " please use probe_jobs_limit instead"

  self
end
#probe_jobs_limit ⇒ Object
Count worker classes present in Sidekiq queues. This only looks at the first PROBE_JOBS_LIMIT jobs in each queue. This means that we run a single LRANGE command for each queue, which does not block other commands. For queues over PROBE_JOBS_LIMIT in size, this means that we will not have completely accurate statistics, but the probe performance will also not degrade as the queue gets larger.
# File 'lib/gitlab_exporter/sidekiq.rb', line 110

def probe_jobs_limit
  with_sidekiq do
    job_stats = Hash.new(0)

    Sidekiq::Queue.all.each do |queue|
      Sidekiq.redis do |conn|
        conn.lrange("queue:#{queue.name}", 0, PROBE_JOBS_LIMIT).each do |job|
          job_class = Sidekiq.load_json(job)["class"]

          job_stats[job_class] += 1
        end
      end
    end

    job_stats.each do |class_name, count|
      @metrics.add("sidekiq_enqueued_jobs", count, name: class_name)
    end
  end

  self
end
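The head-of-queue sampling boils down to: take at most PROBE_JOBS_LIMIT items per queue, decode each, and tally by class. A Redis-free sketch with plain arrays (the queue contents and the tiny limit are made up for the example):

```ruby
PROBE_JOBS_LIMIT = 3 # tiny limit for the example; the real value is 1_000

queues = {
  "default" => %w[MailerJob MailerJob ImportJob ExportJob ExportJob],
  "low"     => %w[CleanupJob],
}

job_stats = Hash.new(0)
queues.each_value do |jobs|
  # Like LRANGE queue 0 LIMIT: only the first few jobs are examined, so
  # counts for longer queues are truncated but probe cost stays bounded.
  jobs.first(PROBE_JOBS_LIMIT).each { |job_class| job_stats[job_class] += 1 }
end
```

Here `ExportJob` never appears in the tally: both copies sit beyond the sampled head of "default", which is exactly the completeness-for-predictability trade-off described above.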
#probe_queues ⇒ Object
# File 'lib/gitlab_exporter/sidekiq.rb', line 64

def probe_queues
  with_sidekiq do
    Sidekiq::Queue.all.each do |queue|
      @metrics.add("sidekiq_queue_size", queue.size, name: queue.name)
      @metrics.add("sidekiq_queue_latency_seconds", queue.latency, name: queue.name)
      @metrics.add("sidekiq_queue_paused", queue.paused? ? 1 : 0, name: queue.name)
    end
  end

  self
end
#probe_retries ⇒ Object
# File 'lib/gitlab_exporter/sidekiq.rb', line 150

def probe_retries
  with_sidekiq do
    retry_stats = Hash.new(0)

    Sidekiq::RetrySet.new.map do |job|
      retry_stats[job.klass] += 1
    end

    retry_stats.each do |class_name, count|
      @metrics.add("sidekiq_to_be_retried_jobs", count, name: class_name)
    end
  end

  self
end
#probe_stats ⇒ Object
# File 'lib/gitlab_exporter/sidekiq.rb', line 45

def probe_stats
  with_sidekiq do
    stats = Sidekiq::Stats.new

    @metrics.add("sidekiq_jobs_processed_total", stats.processed)
    @metrics.add("sidekiq_jobs_failed_total", stats.failed)
    @metrics.add("sidekiq_jobs_enqueued_size", stats.enqueued)
    @metrics.add("sidekiq_jobs_scheduled_size", stats.scheduled_size)
    @metrics.add("sidekiq_jobs_retry_size", stats.retry_size)
    @metrics.add("sidekiq_jobs_dead_size", stats.dead_size)
    @metrics.add("sidekiq_default_queue_latency_seconds", stats.default_queue_latency)
    @metrics.add("sidekiq_processes_size", stats.processes_size)
    @metrics.add("sidekiq_workers_size", stats.workers_size)
  end

  self
end
#probe_workers ⇒ Object
# File 'lib/gitlab_exporter/sidekiq.rb', line 132

def probe_workers
  with_sidekiq do
    worker_stats = Hash.new(0)

    Sidekiq::Workers.new.map do |_pid, _tid, work|
      job_klass = work["payload"]["class"]

      worker_stats[job_klass] += 1
    end

    worker_stats.each do |class_name, count|
      @metrics.add("sidekiq_running_jobs", count, name: class_name)
    end
  end

  self
end
#write_to(target) ⇒ Object
# File 'lib/gitlab_exporter/sidekiq.rb', line 177

def write_to(target)
  target.write(@metrics.to_s)
end
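Since every probe returns `self`, a caller can chain several probes and then flush the accumulated metrics with `write_to`. A sketch of that flow using a hypothetical in-memory metrics store (`TextMetrics`, not part of the library) and `StringIO` as the write target:

```ruby
require "stringio"

# TextMetrics is a hypothetical minimal metrics store: #add records
# gauges, #to_s renders them one per line in exposition-style text.
class TextMetrics
  def initialize
    @lines = []
  end

  def add(name, value, **labels)
    label_s = labels.map { |k, v| %(#{k}="#{v}") }.join(",")
    @lines << (label_s.empty? ? "#{name} #{value}" : "#{name}{#{label_s}} #{value}")
  end

  def to_s
    @lines.join("\n") + "\n"
  end
end

metrics = TextMetrics.new
metrics.add("sidekiq_queue_size", 42, name: "default")
metrics.add("sidekiq_jobs_dead_size", 7)

target = StringIO.new
target.write(metrics.to_s) # what SidekiqProber#write_to does with its target
```

In the real exporter the target is typically the HTTP response body of the `/sidekiq` probe endpoint.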