Class: Silicium::Regression::LinearRegressionByGradientDescent

Inherits:
Object
Defined in:
lib/regression.rb

Class Method Summary

Class Method Details

.generate_function(plot, alpha = 0.01, start_theta0 = 0.0, start_theta1 = 0.0) ⇒ Array<Numeric>

Finds the parameters theta0 and theta1 of the line theta0 + theta1 * x that fits a single-variable linear regression to the given plot, using gradient descent.

Parameters:

  • plot

    A Hash mapping each x value to its y value for the points of the plot

  • alpha (defaults to: 0.01)

    Learning rate (should be small enough that gradient descent does not diverge)

  • start_theta0 (defaults to: 0.0)

    Starting value of theta0

  • start_theta1 (defaults to: 0.0)

    Starting value of theta1

Returns:

  • (Array<Numeric>)

    the fitted pair [theta0, theta1]
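
A minimal usage sketch. The require line, the sample points, and the expected result are illustrative assumptions, not part of the library's documentation:

# Illustrative sketch; assumes the gem is loaded via `require 'silicium'`.
require 'silicium'

# Noiseless points on the line y = 1 + 2x.
plot = { 1.0 => 3.0, 2.0 => 5.0, 3.0 => 7.0, 4.0 => 9.0 }

regression = Silicium::Regression::LinearRegressionByGradientDescent
theta0, theta1 = regression.generate_function(plot, 0.01)

# Should converge close to theta0 = 1, theta1 = 2 for these points.
puts "y = #{theta0.round(3)} + #{theta1.round(3)} * x"

Because the loop stops when successive parameter changes fall below 1e-6, the returned values approximate the least-squares fit rather than matching it exactly.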



# File 'lib/regression.rb', line 11

def self.generate_function(plot, alpha = 0.01, start_theta0 = 0.0, start_theta1 = 0.0)
  theta0 = start_theta0
  theta1 = start_theta1
  m = plot.length.to_f # number of training points
  epsilon = 0.000001   # convergence threshold on the parameter change
  bias_new = 5.0       # any value larger than epsilon, so the loop runs at least once
  while bias_new.abs > epsilon
    old_theta0, old_theta1 = theta0, theta1
    oth = theta0       # keep the old theta0 so both gradients use the same parameters
    theta0 -= alpha / m * d_dt_for_theta0(plot, theta0, theta1)
    theta1 -= alpha / m * d_dt_for_theta1(plot, oth, theta1)
    bias_new = [(theta0 - old_theta0).abs, (theta1 - old_theta1).abs].max
  end
  [theta0, theta1]
end
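
The helpers d_dt_for_theta0 and d_dt_for_theta1 are defined elsewhere in lib/regression.rb and are not shown above. Under the usual squared-error cost they would compute the sums sketched below (the 1/m normalisation is applied inside generate_function); this is an assumed sketch, not the library's actual code:

# Assumed sketch of the gradient helpers for the squared-error cost of
# h(x) = theta0 + theta1 * x; the real implementations in lib/regression.rb may differ.
def self.d_dt_for_theta0(plot, theta0, theta1)
  plot.sum { |x, y| (theta0 + theta1 * x) - y }
end

def self.d_dt_for_theta1(plot, theta0, theta1)
  plot.sum { |x, y| ((theta0 + theta1 * x) - y) * x }
end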