Class: FeldtRuby::Optimize::DEOptimizerBase
- Inherits: EvolutionaryOptimizer
  - Object
  - Optimizer
  - PopulationBasedOptimizer
  - EvolutionaryOptimizer
  - FeldtRuby::Optimize::DEOptimizerBase
- Defined in:
- lib/feldtruby/optimize/differential_evolution.rb
Overview
Base class for Differential Evolution (DE) for continuous, real-valued optimization. Since there are many different DE variants this is the base class from which we can then include different strategy parts and create complete DE classes.
A DE strategy generates a new trial vector as a candidate to replace a parent vector. It is composed of four parts:
- a mutation strategy that samples a set of parents to create a donor vector
- a crossover strategy which takes a donor and parent vector and creates a trial vector
- a bounding strategy which ensures the trial vector is within the search space
- an update strategy which can be used to self-adapt parameters based on feedback on improvements
A strategy gets feedback on whether the latest trial vector was an improvement. It can use this feedback to adapt its operation over time.
We implement strategies as Ruby Modules that we can include in different DE optimizer classes that inherit from the base class above. For maximum flexibility, each of the four parts of a DE strategy is implemented in a separate Module so we can mix and match them.
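As a sketch of this mix-and-match design, the four parts can be modules mixed into one class. The module and method names below are illustrative assumptions for the sketch, not FeldtRuby's actual modules:

```ruby
require 'matrix'

# Illustrative stand-ins for the strategy parts (names are assumptions,
# not FeldtRuby's actual modules).
module Rand1Mutation
  # DE/rand/1 donor: x_r1 + F * (x_r2 - x_r3)
  def mutate(r1, r2, r3)
    r1 + (r2 - r3) * scale_factor
  end
end

module BinomialCrossover
  # Take each position from the donor with probability crossover_rate.
  def crossover(target, donor, rng = Random.new)
    Vector.elements(
      target.each_with_index.map { |x, i| rng.rand < crossover_rate ? donor[i] : x }
    )
  end
end

module ClampBounding
  # Clamp every component back into [0, 1].
  def bound(trial)
    trial.map { |x| x.clamp(0.0, 1.0) }
  end
end

# A complete DE variant is assembled by mixing the parts into one class.
class ToyDE
  include Rand1Mutation
  include BinomialCrossover
  include ClampBounding

  def scale_factor;   0.7; end
  def crossover_rate; 0.5; end
end
```

Swapping one `include` for another module yields a different DE variant without touching the rest of the class, which is the flexibility the base class is designed for.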
Constant Summary
- DefaultOptions =
{ :DE_F_ScaleFactor => 0.7, :DE_CR_CrossoverRate => 0.5, :DE_NumParentsToSample => 4, }
Instance Attribute Summary
Attributes inherited from PopulationBasedOptimizer
Attributes inherited from Optimizer
#archive, #num_optimization_steps, #objective, #options, #search_space, #termination_criterion
Attributes included from Logging
Instance Method Summary
- #candidate_from_array(ary) ⇒ Object
  Create a candidate from an array.
- #crossover_rate(position) ⇒ Object
  Crossover rate.
- #generate_trial_candidate_and_target ⇒ Object
  Main entry point for a DEStrategy.
- #initialize_options(options) ⇒ Object
- #initialize_population(sizeOfPopulation) ⇒ Object
  Create a population of a given size by randomly sampling candidates from the search space and converting them to Vectors so we can more easily calculate on them later.
- #num_parents_to_sample ⇒ Object
  Number of parents to sample.
- #optimization_step ⇒ Object
  One step of the optimization is to (try to) update one vector.
- #sample_parents ⇒ Object
  Sample parents from the population and return their indices.
- #scale_factor(targetVectorIndex) ⇒ Object
  Scale factor F.
Methods inherited from PopulationBasedOptimizer
#get_candidate, #get_candidates_with_indices, #population_size, #re_initialize_population, #sample_population_indices_without_replacement, #update_candidate_in_population
Methods inherited from Optimizer
#best, #init_archive, #initialize, #log_end_of_optimization, #optimize, #time_per_step, #update_archive
Methods included from Logging
#__find_logger_set_on_instance_vars, #new_default_logger, #setup_logger_and_distribute_to_instance_variables
Constructor Details
This class inherits a constructor from FeldtRuby::Optimize::Optimizer
Instance Method Details
#candidate_from_array(ary) ⇒ Object
Create a candidate from an array. By default we represent candidates with Ruby vectors since they allow vector-based arithmetic.
# File 'lib/feldtruby/optimize/differential_evolution.rb', line 51

def candidate_from_array(ary)
  Vector.elements(ary)
end
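Ruby's Vector class (from the matrix standard library) is what makes the later vector arithmetic convenient; a quick illustration:

```ruby
require 'matrix'

a = Vector.elements([1.0, 2.0, 3.0])
b = Vector.elements([5.0, 6.0, 7.0])

# Element-wise arithmetic of the kind DE mutation relies on:
d = a + (b - a) * 0.5
# => Vector[3.0, 4.0, 5.0]
```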
#crossover_rate(position) ⇒ Object
Crossover rate. Default is to use the one set in the optimizer, regardless of position of the crossover position.
# File 'lib/feldtruby/optimize/differential_evolution.rb', line 94

def crossover_rate(position); @cr; end
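The crossover strategy itself lives in a separate module and is not shown in this base class. As a hedged sketch of how a per-position rate like this is typically used, binomial (uniform) crossover could look as follows; the function name and the j_rand convention are assumptions for the sketch:

```ruby
require 'matrix'

# Binomial crossover sketch: each position is taken from the donor with
# probability cr; position j_rand is forced from the donor so the trial
# always differs from the target in at least one component.
def binomial_crossover(target, donor, cr, rng = Random.new)
  j_rand = rng.rand(target.size)
  Vector.elements(
    target.each_with_index.map do |x, i|
      (i == j_rand || rng.rand < cr) ? donor[i] : x
    end
  )
end
```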
#generate_trial_candidate_and_target ⇒ Object
Main entry point for a DEStrategy. Generates a new trial vector and the parent it targets.
# File 'lib/feldtruby/optimize/differential_evolution.rb', line 103

def generate_trial_candidate_and_target()
  # Sample parents. The first parent returned is used as target parent to cross-over with.
  # Rest of the sampled parents is/can be used in mutation.
  target_parent_index, *parent_indices = sample_parents()
  target = get_candidate(target_parent_index)

  # The three main steps. We get feedback from optimizer at a later stage.
  donor = mutate(target_parent_index, parent_indices)               # Should be implemented by a MutationStrategy
  trial = crossover_donor_and_target(target, donor,
    target_parent_index)                                            # Should be implemented by a CrossoverStrategy
  trial = bound_trial_candidate(trial)                              # Should be implemented by a BoundingStrategy

  return trial, target, target_parent_index
end
#initialize_options(options) ⇒ Object
# File 'lib/feldtruby/optimize/differential_evolution.rb', line 35

def initialize_options(options)
  super
  @options = DefaultOptions.clone.update(options)
  @f = @scale_factor = @options[:DE_F_ScaleFactor]
  @cr = @crossover_rate = @options[:DE_CR_CrossoverRate]
  @num_parents_to_sample = @options[:DE_NumParentsToSample]
end
#initialize_population(sizeOfPopulation) ⇒ Object
Create a population of a given size by randomly sampling candidates from the search space and converting them to Vectors so we can more easily calculate on them later.
# File 'lib/feldtruby/optimize/differential_evolution.rb', line 45

def initialize_population(sizeOfPopulation)
  @population = Array.new(sizeOfPopulation).map {Vector.elements(search_space.gen_candidate())}
end
#num_parents_to_sample ⇒ Object
Number of parents to sample. Default is that this is constant but it can be overridden by a mutation strategy.
# File 'lib/feldtruby/optimize/differential_evolution.rb', line 86

def num_parents_to_sample; options[:DE_NumParentsToSample]; end
#optimization_step ⇒ Object
One step of the optimization is to (try to) update one vector. Thus, this is more of a steady-state than a generational EC. DE is typically a generational EC but it is hard to see any reason why. The default DE here is the classic DE/rand/1/*
# File 'lib/feldtruby/optimize/differential_evolution.rb', line 58

def optimization_step()
  trial, target, target_index = generate_trial_candidate_and_target()
  best, worst = objective.rank_candidates([target, trial])

  # Supplant the target vector with the trial vector if trial vector is better.
  if best != target
    logger.log_data :better_candidate_found, {
      "Trial" => trial,
      "Trial Quality" => @objective.quality_of(trial),
      "Target" => target,
      "Target Quality" => @objective.quality_of(target)
    }, "DE (step #{@num_optimization_steps}): Trial vector was better than target vector"

    update_candidate_in_population(target_index, trial)
    feedback_on_trial_vs_target(trial, target, true)
    [best]
  else
    feedback_on_trial_vs_target(trial, target, false)
    []
  end
end
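The mutation step that produces the donor is delegated to a strategy module and is not part of this base class. For the classic DE/rand/1 named above, the donor is built from three distinct parents as x_r1 + F * (x_r2 - x_r3); a minimal sketch, with the helper name being an assumption:

```ruby
require 'matrix'

# DE/rand/1 donor vector from three distinct parents and scale factor F.
def de_rand_1_donor(x_r1, x_r2, x_r3, f = 0.7)
  x_r1 + (x_r2 - x_r3) * f
end
```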
#sample_parents ⇒ Object
Sample parents from the population and return their indices.
# File 'lib/feldtruby/optimize/differential_evolution.rb', line 97

def sample_parents()
  sample_population_indices_without_replacement(num_parents_to_sample)
end
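sample_population_indices_without_replacement is inherited from PopulationBasedOptimizer and not shown here. Its effect can be sketched with Array#sample, which already draws without replacement; the helper name below is an assumption:

```ruby
# Draw num_samples distinct indices from a population of the given size.
# Array#sample never repeats elements within one call.
def sample_indices_without_replacement(population_size, num_samples)
  (0...population_size).to_a.sample(num_samples)
end
```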
#scale_factor(targetVectorIndex) ⇒ Object
Scale factor F. Default is to use the one set in the optimizer, regardless of target vector.
# File 'lib/feldtruby/optimize/differential_evolution.rb', line 90

def scale_factor(targetVectorIndex); @f; end