Method: ActiveRecord::Batches#find_each
- Defined in:
- lib/active_record/relation/batches.rb
#find_each(start: nil, finish: nil, batch_size: 1000, error_on_ignore: nil, cursor: primary_key, order: DEFAULT_ORDER, &block) ⇒ Object
Looping through a collection of records from the database (using the Scoping::Named::ClassMethods.all method, for example) is very inefficient since it will try to instantiate all the objects at once.
In that case, batch processing methods allow you to work with the records in batches, thereby greatly reducing memory consumption.
The #find_each method uses #find_in_batches with a batch size of 1000 (or as specified by the :batch_size option).
Person.find_each do |person|
  person.do_awesome_stuff
end

Person.where("age > 21").find_each do |person|
  person.party_all_night!
end
If you do not provide a block to #find_each, it will return an Enumerator for chaining with other methods:
Person.find_each.with_index do |person, index|
  person.award_trophy(index + 1)
end
Options

- :batch_size - Specifies the size of the batch. Defaults to 1000 (see the sketch below).
- :start - Specifies the cursor column value to start from, inclusive of the value.
- :finish - Specifies the cursor column value to end at, inclusive of the value.
- :error_on_ignore - Overrides the application config to specify if an error should be raised when an order is present in the relation.
- :cursor - Specifies the column to use for batching (can be a column name or an array of column names). Defaults to the primary key.
- :order - Specifies the cursor column order (can be :asc or :desc, or an array consisting of :asc or :desc). Defaults to :asc.

class Order < ActiveRecord::Base
  self.primary_key = [:id_1, :id_2]
end

Order.find_each(order: [:asc, :desc])

In the above code, id_1 is sorted in ascending order and id_2 in descending order.
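As a quick illustration of the :batch_size option, this sketch (reusing the Person examples from this page) fetches records 5,000 at a time instead of the default 1,000:

Person.find_each(batch_size: 5_000) do |person|
  person.party_all_night!
end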
Limits are honored, and if present there is no requirement for the batch size: it can be less than, equal to, or greater than the limit.
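For example, in the sketch below (again using the placeholder Person model) at most 10,000 records are yielded in total, even though they are fetched 1,000 at a time:

Person.limit(10_000).find_each(batch_size: 1_000) do |person|
  person.party_all_night!
end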
The options :start and :finish are especially useful if you want multiple workers dealing with the same processing queue. You can make worker 1 handle all the records between id 1 and 9999 and worker 2 handle from 10000 and beyond by setting the :start and :finish options on each worker.
# In worker 1, let's process records up to id 9_999.
Person.find_each(finish: 9_999) do |person|
  person.party_all_night!
end

# In worker 2, let's process records from id 10_000 onwards.
Person.find_each(start: 10_000) do |person|
  person.party_all_night!
end
NOTE: Order can be ascending (:asc) or descending (:desc). It is automatically set to ascending on the primary key (“id ASC”). This also means that this method only works when the cursor column is orderable (e.g. an integer or string).
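For instance, to walk the table from the highest primary key value down to the lowest, a sketch using the same Person model:

Person.find_each(order: :desc) do |person|
  person.party_all_night!
end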
NOTE: When using custom columns for batching, they should include at least one unique column (e.g. the primary key) as a tiebreaker. Also, to reduce the likelihood of race conditions, all cursor columns should be static (unchangeable after they are set).
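A sketch of such a cursor, assuming the Person table has an immutable created_at column and using the primary key id as the unique tiebreaker:

Person.find_each(cursor: [:created_at, :id]) do |person|
  person.party_all_night!
end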
NOTE: By its nature, batch processing is subject to race conditions if other processes are modifying the database.
# File 'lib/active_record/relation/batches.rb', line 85

def find_each(start: nil, finish: nil, batch_size: 1000, error_on_ignore: nil, cursor: primary_key, order: DEFAULT_ORDER, &block)
  if block_given?
    find_in_batches(start: start, finish: finish, batch_size: batch_size, error_on_ignore: error_on_ignore, cursor: cursor, order: order) do |records|
      records.each(&block)
    end
  else
    enum_for(:find_each, start: start, finish: finish, batch_size: batch_size, error_on_ignore: error_on_ignore, cursor: cursor, order: order) do
      relation = self
      cursor = Array(cursor)
      apply_limits(relation, cursor, start, finish, build_batch_orders(cursor, order)).size
    end
  end
end