Class: Gitlab::BackgroundMigration::CleanupOptimisticLockingNulls
- Inherits: Object
- Defined in: lib/gitlab/background_migration/cleanup_optimistic_locking_nulls.rb
Constant Summary
- QUERY_ITEM_SIZE = 1_000
Instance Method Summary
- #define_model_for(table) ⇒ Object
- #perform(start_id, stop_id, table) ⇒ Object
  table - The name of the table the migration is performed for.
Instance Method Details
#define_model_for(table) ⇒ Object
# File 'lib/gitlab/background_migration/cleanup_optimistic_locking_nulls.rb', line 25

# Builds an anonymous ActiveRecord model bound to the given table name,
# so the migration can query tables that have no dedicated model class.
def define_model_for(table)
  Class.new(ActiveRecord::Base) do
    self.table_name = table
  end
end
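As a hypothetical illustration (the table name 'epics' and the direct instantiation are assumptions, not part of these docs), the returned anonymous class behaves like any ActiveRecord model scoped to the given table:

# Hypothetical usage sketch; 'epics' is only an example table name.
migration = Gitlab::BackgroundMigration::CleanupOptimisticLockingNulls.new
model = migration.define_model_for('epics')

model.table_name                      # => "epics"
model.where(lock_version: nil).count  # rows still needing cleanup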
#perform(start_id, stop_id, table) ⇒ Object
start_id - The ID of the object to start at.
stop_id - The ID of the object to end at.
table - The name of the table the migration is performed for.
# File 'lib/gitlab/background_migration/cleanup_optimistic_locking_nulls.rb', line 12

def perform(start_id, stop_id, table)
  model = define_model_for(table)

  # After analysis, a batch size of 1,000 items per query was found to be
  # the most optimal. Discussion in
  # https://gitlab.com/gitlab-org/gitlab/-/merge_requests/18418#note_282285336
  (start_id..stop_id).each_slice(QUERY_ITEM_SIZE).each do |range|
    model
      .where(lock_version: nil)
      .where("ID BETWEEN ? AND ?", range.first, range.last)
      .update_all(lock_version: 0)
  end
end
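To make the batching concrete, here is a minimal, self-contained Ruby sketch (the range endpoints are illustrative only) showing how each_slice carves an inclusive ID range into chunks of at most QUERY_ITEM_SIZE IDs, each of which becomes one UPDATE:

QUERY_ITEM_SIZE = 1_000

# An inclusive range of 2,500 IDs is processed as three slices:
(1..2_500).each_slice(QUERY_ITEM_SIZE).map { |range| [range.first, range.last] }
# => [[1, 1000], [1001, 2000], [2001, 2500]]

Invoking the migration directly would then look like this (the arguments here are hypothetical):

Gitlab::BackgroundMigration::CleanupOptimisticLockingNulls.new.perform(1, 2_500, 'epics')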