Class: Batch

Inherits:
ApplicationRecord
Extended by:
EventfulRecord
Includes:
AASM, Api::BatchIO::Extensions, Api::Messages::FlowcellIO::Extensions, PipelineBehaviour, StateMachineBehaviour, Commentable, SequencingQcBatch, StandardNamedScopes, Uuid::Uuidable
Defined in:
app/models/batch.rb

Overview

A Batch groups 1 or more requests together to enable processing in a Pipeline. All requests in a batch are usually processed together, although it is possible for requests to be removed from a batch in a handful of cases.

Defined Under Namespace

Modules: PipelineBehaviour, RequestBehaviour, StateMachineBehaviour Classes: RequestFailAndRemover

Constant Summary

DEFAULT_VOLUME =

The three states of Batch. Also @see SequencingQcBatch

13

Constants included from StandardNamedScopes

StandardNamedScopes::SORT_FIELDS, StandardNamedScopes::SORT_ORDERS

Constants included from SequencingQcBatch

SequencingQcBatch::VALID_QC_STATES

Instance Attribute Summary

Class Method Summary

Instance Method Summary

Methods included from EventfulRecord

has_many_events, has_many_lab_events, has_one_event_with_family

Methods included from StateMachineBehaviour

#complete_with_user!, #editable?, #finished?, included, #release_with_user!, #start_with_user!

Methods included from PipelineBehaviour

#has_item_limit?, included, #last_completed_task

Methods included from StandardNamedScopes

included

Methods included from Uuid::Uuidable

included, #unsaved_uuid!, #uuid

Methods included from Commentable

#after_comment_addition

Methods included from SequencingQcBatch

adjacent_state_helper, included, #processing_in_manual_qc?, #qc_manual_in_progress, #qc_previous_state!, #qc_states, state_transition_helper

Methods included from Api::BatchIO::Extensions

included

Methods inherited from ApplicationRecord

convert_labware_to_receptacle_for, find_by_id_or_name, find_by_id_or_name!

Methods included from Squishify

extended

Methods included from Warren::BroadcastMessages

#broadcast, included, #queue_associated_for_broadcast, #queue_for_broadcast, #warren

Instance Attribute Details

#production_state ⇒ Object

Also referenced in StateMachineBehaviour. Either nil, or 'fail'. This is updated in Batch#fail_requests and Batch#fail; the former is used via BatchesController#fail_items, the latter seems to be unused. It is intended to take precedence over both other states to track failures in spite of QC results.


# File 'app/models/batch.rb', line 29

DEFAULT_VOLUME = 13

#qc_state ⇒ Object

Primarily for sequencing batches. See SequencingQcBatch. Holds the sequencing QC state.


# File 'app/models/batch.rb', line 29

DEFAULT_VOLUME = 13

#state ⇒ Object

The main state machine, used to track the batch through the pipeline. Handled by StateMachineBehaviour.


# File 'app/models/batch.rb', line 29

DEFAULT_VOLUME = 13

Class Method Details

.barcode_without_pick_number(code) ⇒ Object


# File 'app/models/batch.rb', line 497

def self.barcode_without_pick_number(code)
  code.split('-').first
end

.extract_pick_number(code) ⇒ Object


# File 'app/models/batch.rb', line 501

def self.extract_pick_number(code)
  # expecting format 550000555760-1 with pick number at end
  split_code = code.split('-')
  return Integer(split_code.last) if split_code.size > 1

  # default to 1 if the pick number is not present
  1
end
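
For illustration, a hedged sketch of how the two pick-number helpers behave, using a hypothetical barcode in the expected format:

Batch.barcode_without_pick_number('550000555760-2') # => "550000555760"
Batch.extract_pick_number('550000555760-2')         # => 2
Batch.extract_pick_number('550000555760')           # => 1 (default when no pick number is present)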

.find_by_barcode(code) ⇒ Object Also known as: find_from_barcode


# File 'app/models/batch.rb', line 511

def find_by_barcode(code)
  split_code = barcode_without_pick_number(code)
  human_batch_barcode = Barcode.number_to_human(split_code)
  batch = Batch.find_by(barcode: human_batch_barcode)
  batch ||= Batch.find_by(id: human_batch_barcode)

  batch
end

.prefix ⇒ Object


# File 'app/models/batch.rb', line 478

def self.prefix
  'BA'
end

.valid_barcode?(code) ⇒ Boolean

Returns:

  • (Boolean)

# File 'app/models/batch.rb', line 482

def self.valid_barcode?(code)
  begin
    split_code = barcode_without_pick_number(code)
    Barcode.barcode_to_human!(split_code, prefix)
  rescue
    return false
  end

  if find_from_barcode(code).nil?
    return false
  end

  true
end

Instance Method Details

#all_requests_are_ready? ⇒ Boolean

Returns:

  • (Boolean)

# File 'app/models/batch.rb', line 107

def all_requests_are_ready?
  # Checks that SequencingRequests have at least one LibraryCreationRequest in passed status before being processed
  # (as referred by #75102998)
  unless requests.all?(&:ready?)
    errors.add :base, 'All requests must be ready to be added to a batch'
  end
end

#assign_positions_to_requests!(request_ids_in_position_order) ⇒ Object

Sets the position of the requests in the batch to their index in the array.

Raises:

  • (StandardError)

# File 'app/models/batch.rb', line 211

def assign_positions_to_requests!(request_ids_in_position_order)
  disparate_ids = batch_requests.map(&:request_id) - request_ids_in_position_order
  raise StandardError, 'Can only sort all requests at once' unless disparate_ids.empty?

  BatchRequest.transaction do
    batch_requests.each do |batch_request|
      batch_request.move_to_position!(request_ids_in_position_order.index(batch_request.request_id) + 1)
    end
  end
end
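
A minimal usage sketch, assuming hypothetical request ids 12, 10 and 11 are the only requests in the batch; positions are taken from the array order (index + 1):

batch.assign_positions_to_requests!([12, 10, 11])
# request 12 => position 1, request 10 => position 2, request 11 => position 3
# raises StandardError unless every request in the batch appears in the array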

#assigned_user ⇒ Object


# File 'app/models/batch.rb', line 224

def assigned_user
  assignee.try(:login) || ''
end

#batch_meets_minimum_size ⇒ Object


# File 'app/models/batch.rb', line 127

def batch_meets_minimum_size
  if min_size && (requests.size < min_size)
    errors.add :base, "You must create batches of at least #{min_size} requests in the pipeline #{pipeline.name}"
  end
end

#control ⇒ Object


# File 'app/models/batch.rb', line 202

def control
  requests.detect { |request| request.try(:asset).try(:resource?) }
end

#detach_request(request, current_user = nil) ⇒ Object

Remove a request from the batch and reset it to a point where it can be put back into the pending queue.


# File 'app/models/batch.rb', line 363

def detach_request(request, current_user = nil)
  ActiveRecord::Base.transaction do
    unless current_user.nil?
      request.add_comment("Used to belong to Batch #{id} removed at #{Time.zone.now}",
                          current_user)
    end
    pipeline.detach_request_from_batch(self, request)
  end
end

#display_tags? ⇒ Boolean

Returns:

  • (Boolean)

# File 'app/models/batch.rb', line 298

def display_tags?
  multiplexed?
end

#displayed_status ⇒ Object

Summarises the state encapsulated by state and production_state. Essentially a 'fail' production_state overrides the 'state'. We don't use production_state directly as it is 'fail' rather than 'failed'. qc_state is kept separate as it is a fairly distinct concept and is summarised elsewhere in the interface.


# File 'app/models/batch.rb', line 562

def displayed_status
  failed? ? 'failed' : state
end
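
Illustrative (hypothetical) values showing how a 'fail' production_state overrides state:

batch.state             # => "started"
batch.production_state  # => "fail"
batch.displayed_status  # => "failed"
# with production_state nil, displayed_status simply returns state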

#downstream_requests_needing_asset(request) {|next_requests_needing_asset| ... } ⇒ Object

Yields:

  • (next_requests_needing_asset)

# File 'app/models/batch.rb', line 543

def downstream_requests_needing_asset(request)
  next_requests_needing_asset = request.next_requests.select { |r| r.asset_id.blank? }
  yield(next_requests_needing_asset) if next_requests_needing_asset.present?
end

#event_with_description(name) ⇒ Object


# File 'app/models/batch.rb', line 190

def event_with_description(name)
  lab_events.order(id: :desc).find_by(description: name)
end

#eventful_studies ⇒ Object


# File 'app/models/batch.rb', line 119

def eventful_studies
  requests.reduce([]) { |studies, request| studies.concat(request.eventful_studies) }.uniq
end

#fail(reason, comment, ignore_requests = false) ⇒ Object

Fail was removed from the state machine (as a state) to allow the addition of the qc_state column and related features.

Raises:

  • (StandardError)

# File 'app/models/batch.rb', line 140

def fail(reason, comment, ignore_requests = false)
  # We've deprecated the ability to fail a batch but not its requests.
  # Keep this check here until we're sure we haven't missed anything.
  raise StandardError, 'Can not fail batch without failing requests' if ignore_requests

  # create failures
  failures.create(reason: reason, comment: comment, notify_remote: false)

  requests.each do |request|
    request.failures.create(reason: reason, comment: comment, notify_remote: true)
    unless request.asset && request.asset.resource?
      EventSender.send_fail_event(request, reason, comment, id)
    end
  end

  self.production_state = 'fail'
  save!
end
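
A hedged usage sketch; the reason and comment strings are hypothetical, and ignore_requests must be left false because failing a batch without its requests is no longer supported:

batch.fail('Other', 'Flowcell damaged during handling')
# creates a failure on the batch and on each request, then sets production_state to 'fail'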

#fail_requests(requests_to_fail, reason, comment, fail_but_charge = false) ⇒ Object

Fail specific requests on this batch


# File 'app/models/batch.rb', line 160

def fail_requests(requests_to_fail, reason, comment, fail_but_charge = false)
  ActiveRecord::Base.transaction do
    requests.find(requests_to_fail).each do |request|
      logger.debug "SENDING FAIL FOR REQUEST #{request.id}, BATCH #{id}, WITH REASON #{reason}"

      request.customer_accepts_responsibility! if fail_but_charge
      request.failures.create(reason: reason, comment: comment, notify_remote: true)
      EventSender.send_fail_event(request, reason, comment, id)
    end
    update_batch_state(reason, comment)
  end
end
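
A minimal sketch, assuming the hypothetical request ids belong to this batch; the final argument charges the customer despite the failure:

batch.fail_requests([1001, 1002], 'Failed QC', 'Low yield on lanes 1 and 2', true)
# each request gains a failure and a fail event, then update_batch_state is called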

#failed? ⇒ Boolean

Returns:

  • (Boolean)

# File 'app/models/batch.rb', line 181

def failed?
  production_state == 'fail'
end

#first_output_plate ⇒ Object


# File 'app/models/batch.rb', line 269

def first_output_plate
  Plate.output_by_batch(self).with_wells_and_requests.first
end

#flowcell ⇒ Object


# File 'app/models/batch.rb', line 123

def flowcell
  self if sequencing?
end

#has_control? ⇒ Boolean

Returns:

  • (Boolean)

# File 'app/models/batch.rb', line 206

def has_control?
  control.present?
end

#has_event(event_name) ⇒ Object

Tests whether this Batch has an associated LabEvent with the given description (case-insensitive).


# File 'app/models/batch.rb', line 186

def has_event(event_name)
  lab_events.any? { |event| event_name.downcase == event.description.try(:downcase) }
end
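
The comparison is case-insensitive on the event description, for example (using an event description created elsewhere in this class):

batch.has_event('Tube layout verified') # => true if such a LabEvent exists
batch.has_event('tube layout verified') # => same result; descriptions are downcased before comparison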

#id_dup ⇒ Object


# File 'app/models/batch.rb', line 302

def id_dup
  id
end

#input_labware_report ⇒ Labware::ActiveRecord_Relation

Returns a list of input labware including their barcodes, purposes, and a count of the number of requests associated with the batch. Output depends on the Pipeline; some pipelines return an empty relation.

Returns:

  • (Labware::ActiveRecord_Relation)

    The associated labware


# File 'app/models/batch.rb', line 237

def input_labware_report
  pipeline.input_labware requests
end

#input_plate_group ⇒ Object


# File 'app/models/batch.rb', line 250

def input_plate_group
  source_assets.group_by(&:plate)
end

#mpx_library_name ⇒ Object


# File 'app/models/batch.rb', line 291

def mpx_library_name
  return '' unless multiplexed? && requests.any?

  mpx_library_tube = requests.first.target_asset.children.first
  mpx_library_tube&.name || ''
end

#multiplexed_items_with_unique_library_ids ⇒ Object


# File 'app/models/batch.rb', line 306

def multiplexed_items_with_unique_library_ids
  requests.map { |r| r.target_asset.children }.flatten.uniq
end

#npg_set_state ⇒ Object


# File 'app/models/batch.rb', line 531

def npg_set_state
  if all_requests_qced?
    self.state = 'released'
    qc_complete
    save!
  end
end

#output_labware_report ⇒ Labware::ActiveRecord_Relation

Returns a list of output labware including their barcodes, purposes, and a count of the number of requests associated with the batch. Output depends on the Pipeline; some pipelines return an empty relation.

Returns:

  • (Labware::ActiveRecord_Relation)

    The associated labware


# File 'app/models/batch.rb', line 246

def output_labware_report
  pipeline.output_labware requests.with_target
end

#output_plate_group ⇒ Object

This looks odd. Why would a request have the same asset as target asset? Why are we filtering them out here?


# File 'app/models/batch.rb', line 255

def output_plate_group
  requests.select { |r| r.target_asset != r.asset }.map(&:target_asset).select(&:present?).group_by(&:plate)
end

#output_plate_purpose ⇒ Object


# File 'app/models/batch.rb', line 273

def output_plate_purpose
  output_plates[0].plate_purpose unless output_plates[0].nil?
end

#output_plate_role ⇒ Object


# File 'app/models/batch.rb', line 277

def output_plate_role
  requests.first.try(:role)
end

#output_plates ⇒ Object


# File 'app/models/batch.rb', line 259

def output_plates
  # We use re-order here as batch_requests applies a default sort order to
  # the relationship, which takes preference, even though we're has_many throughing
  if output_labware.loaded?
    return output_labware.sort_by(&:id)
  end

  output_labware.reorder(id: :asc)
end

#parent_of_purpose(name) ⇒ Object


# File 'app/models/batch.rb', line 406

def parent_of_purpose(name)
  return nil if requests.empty?

  requests.first.asset.ancestors.joins(
    "INNER JOIN plate_purposes ON #{Plate.table_name}.plate_purpose_id = plate_purposes.id"
  )
          .find_by(plate_purposes: { name: name })
end

#pick_information? ⇒ Boolean

Returns:

  • (Boolean)

# File 'app/models/batch.rb', line 552

def pick_information?
  pipeline.pick_information?(self)
end

#plate_barcode(barcode) ⇒ Object


# File 'app/models/batch.rb', line 287

def plate_barcode(barcode)
  barcode.presence || requests.first.target_asset.plate.human_barcode
end

#plate_group_barcodes ⇒ Object


# File 'app/models/batch.rb', line 281

def plate_group_barcodes
  return nil unless pipeline.group_by_parent || requests.first.target_asset.is_a?(Well)

  output_plate_group.presence || input_plate_group
end

#plate_ids_in_study(study) ⇒ Object


# File 'app/models/batch.rb', line 454

def plate_ids_in_study(study)
  Plate.plate_ids_from_requests(requests.for_studies(study))
end

#rebroadcast ⇒ Object


# File 'app/models/batch.rb', line 548

def rebroadcast
  messengers.each(&:queue_for_broadcast)
end

#release_pending_requests ⇒ Object


# File 'app/models/batch.rb', line 341

def release_pending_requests
  # We set the unused requests to pending.
  # This is to allow unused wells to be cherry-picked again.
  requests.each do |request|
    detach_request(request) if request.started?
  end
end

#remove_link(request) ⇒ Object

# File 'app/models/batch.rb', line 383

def remove_link(request)
  request.batch = nil
end

#remove_request_ids(request_ids, reason = nil, comment = nil) ⇒ Object Also known as: recycle_request_ids

Remove the request from the batch and remove asset information


# File 'app/models/batch.rb', line 350

def remove_request_ids(request_ids, reason = nil, comment = nil)
  ActiveRecord::Base.transaction do
    Request.find(request_ids).each do |request|
      request.failures.create(reason: reason, comment: comment, notify_remote: true)
      detach_request(request)
    end
    update_batch_state(reason, comment)
  end
end
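
A hedged usage sketch with hypothetical request ids; a failure is recorded against each request before it is detached, and the batch state is then re-evaluated:

batch.remove_request_ids([2001, 2002], 'failed', 'Removed after sample mix-up')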

#request_count ⇒ Object


# File 'app/models/batch.rb', line 522

def request_count
  requests.count
end

#requests_have_same_read_length ⇒ Object


# File 'app/models/batch.rb', line 133

def requests_have_same_read_length
  unless pipeline.is_read_length_consistent_for_batch?(self)
    errors.add :base, "The selected requests must have the same values in their 'Read length' field."
  end
end

#reset!(current_user) ⇒ Object


# File 'app/models/batch.rb', line 387

def reset!(current_user)
  ActiveRecord::Base.transaction do
    discard!

    requests.each do |request|
      remove_link(request) # Remove link in all types of pipelines
      return_request_to_inbox(request, current_user)
    end

    if requests.last.submission_id.present?
      Request.where(submission_id: requests.last.submission_id, state: 'pending')
             .where.not(request_type_id: pipeline.request_type_ids).find_each do |request|
        request.asset_id = nil
        request.save!
      end
    end
  end
end

#return_request_to_inbox(request, current_user = nil) ⇒ Object


# File 'app/models/batch.rb', line 373

def return_request_to_inbox(request, current_user = nil)
  ActiveRecord::Base.transaction do
    unless current_user.nil?
      request.add_comment("Used to belong to Batch #{id} returned to inbox unstarted at #{Time.zone.now}",
                          current_user)
    end
    request.return_pending_to_inbox!
  end
end

#robot_id ⇒ Object


# File 'app/models/batch.rb', line 194

def robot_id
  event_with_description('Cherrypick Layout Set')&.descriptor_value('robot_id')
end

#robot_verified!(user_id) ⇒ Object


# File 'app/models/batch.rb', line 470

def robot_verified!(user_id)
  return if has_event('robot verified')

  pipeline.robot_verified!(self)
  lab_events.create(description: 'Robot verified',
                    message: 'Robot verification completed and source volumes updated.', user_id: user_id)
end

#show_actions? ⇒ Boolean

Returns:

  • (Boolean)

# File 'app/models/batch.rb', line 526

def show_actions?
  released? == false or
    pipeline.class.const_get(:ALWAYS_SHOW_RELEASE_ACTIONS)
end

#show_fail_link? ⇒ Boolean

Returns:

  • (Boolean)

# File 'app/models/batch.rb', line 539

def show_fail_link?
  released? && sequencing?
end

#source_labware ⇒ Object

Source labware returns the physical pieces of labware (i.e. a plate for wells, but tubes for tubes).


# File 'app/models/batch.rb', line 311

def source_labware
  input_labware
end

#space_left ⇒ Object


# File 'app/models/batch.rb', line 458

def space_left
  [item_limit - batch_requests.count, 0].max
end

#start_requests ⇒ Object


# File 'app/models/batch.rb', line 228

def start_requests
  requests.with_assets_for_starting_requests.not_failed.map(&:start!)
end

#subject_type ⇒ Object


# File 'app/models/batch.rb', line 115

def subject_type
  sequencing? ? 'flowcell' : 'batch'
end

#swap(current_user, batch_info = {}) ⇒ Object


# File 'app/models/batch.rb', line 415

def swap(current_user, batch_info = {})
  return false if batch_info.empty?

  # Find the two lanes that are to be swapped
  batch_request_left = BatchRequest.find_by(batch_id: batch_info['batch_1']['id'],
                                            position: batch_info['batch_1']['lane']) or errors.add('Swap: ',
                                                                                                   'The first lane cannot be found')
  batch_request_right = BatchRequest.find_by(batch_id: batch_info['batch_2']['id'],
                                             position: batch_info['batch_2']['lane']) or errors.add('Swap: ',
                                                                                                    'The second lane cannot be found')
  return unless batch_request_left.present? && batch_request_right.present?

  ActiveRecord::Base.transaction do
    # Update the lab events for the request so that they reference the batch that the request is moving to
    batch_request_left.request.lab_events.each do |event|
      event.update!(batch_id: batch_request_right.batch_id) if event.batch_id == batch_request_left.batch_id
    end
    batch_request_right.request.lab_events.each do |event|
      event.update!(batch_id: batch_request_left.batch_id)  if event.batch_id == batch_request_right.batch_id
    end

    # Swap the two batch requests so that they are correct.  This involves swapping both the batch and the lane but ensuring that the
    # two requests don't clash on position by removing one of them.
    original_left_batch_id, original_left_position, original_right_request_id = batch_request_left.batch_id, batch_request_left.position, batch_request_right.request_id
    batch_request_right.destroy
    batch_request_left.update!(batch_id: batch_request_right.batch_id, position: batch_request_right.position)
    batch_request_right = BatchRequest.create!(batch_id: original_left_batch_id, position: original_left_position,
                                               request_id: original_right_request_id)

    # Finally record the fact that the batch was swapped
    batch_request_left.batch.lab_events.create!(description: 'Lane swap',
                                                message: "Lane #{batch_request_right.position} moved to #{batch_request_left.batch_id} lane #{batch_request_left.position}", user_id: current_user.id)
    batch_request_right.batch.lab_events.create!(description: 'Lane swap',
                                                 message: "Lane #{batch_request_left.position} moved to #{batch_request_right.batch_id} lane #{batch_request_right.position}", user_id: current_user.id)
  end

  true
end
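
A usage sketch showing the expected shape of batch_info (the ids and lane positions are hypothetical):

batch_info = {
  'batch_1' => { 'id' => 1234, 'lane' => 2 },
  'batch_2' => { 'id' => 5678, 'lane' => 3 }
}
batch.swap(current_user, batch_info) # => true on success, false if batch_info is empty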

#total_volume_to_cherrypick ⇒ Object


# File 'app/models/batch.rb', line 462

def total_volume_to_cherrypick
  request = requests.first
  return DEFAULT_VOLUME unless request.asset.is_a?(Well)
  return DEFAULT_VOLUME unless request.target_asset.is_a?(Well)

  request.target_asset.get_requested_volume
end

#underrun ⇒ Object


# File 'app/models/batch.rb', line 198

def underrun
  has_limit? ? (item_limit - batch_requests.size) : 0
end

#update_batch_state(reason, comment) ⇒ Object


# File 'app/models/batch.rb', line 173

def update_batch_state(reason, comment)
  if requests.all?(&:terminated?)
    failures.create(reason: reason, comment: comment, notify_remote: false)
    self.production_state = 'fail'
    save!
  end
end

#verify_tube_layout(barcodes, user = nil) ⇒ Bool

Verifies that the provided barcodes are in the correct positions according to the request organization within the batch. Either returns true and logs the event, or returns false.

Parameters:

  • barcodes (Array<Integer>)

    An array of 1-7 digit long barcodes

  • user (User) (defaults to: nil)

    The user validating the barcode layout

Returns:

  • (Bool)

    true if the layout is correct, false otherwise


# File 'app/models/batch.rb', line 325

def verify_tube_layout(barcodes, user = nil)
  requests.each do |request|
    barcode = barcodes[request.position - 1]
    unless barcode == request.asset.machine_barcode
      expected_barcode = request.asset.human_barcode
      errors.add(:base, "The tube at position #{request.position} is incorrect: expected #{expected_barcode}.")
    end
  end
  if errors.empty?
    lab_events.create(description: 'Tube layout verified', user: user)
    true
  else
    false
  end
end
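
A minimal sketch with hypothetical machine barcodes, ordered so that index 0 corresponds to the request at position 1:

batch.verify_tube_layout([1042001, 1042002], current_user)
# => true (and a 'Tube layout verified' LabEvent) if every barcode matches, false with errors otherwise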