Class: GLM::Base
- Inherits: Object
  - Object
  - GLM::Base
- Defined in: lib/glm/base.rb
Constant Summary
- @@initial_weight = 1
Class Method Summary
-
.g(eta) ⇒ Object
Canonical response function, intended to be overridden.
Instance Method Summary
-
#a ⇒ Object
Log partition function a(eta), intended to be overridden.
-
#b ⇒ Object
Intended to be overridden.
-
#est(x) ⇒ Object
Estimator. Arguments: x, a feature vector in Array. Returns: the estimation.
-
#eta(x) ⇒ Object
Natural parameter eta.
- #format(x) ⇒ Object
-
#gradient(x, y, v) ⇒ Object
Gradient on one sample.
-
#h(x) ⇒ Object
Hypothesis function, outputs E(y|theta, x), the mean of y given x parameterized by theta. Parameters: x, a feature vector. Returns: E(y|theta, x).
-
#initialize(x, y, alpha = 0.05) ⇒ Base
constructor
A new instance of Base.
-
#output(x) ⇒ Object
Output estimation from E(y|theta, x). Needs overriding, except for plain linear regression.
-
#single_update ⇒ Object
A step based on one sample in stochastic gradient descent.
-
#sto_update ⇒ Object
One complete loop of stochastic gradient descent.
-
#T ⇒ Object
Sufficient statistic T.
- #theta ⇒ Object
Constructor Details
#initialize(x, y, alpha = 0.05) ⇒ Base
Returns a new instance of Base.
# File 'lib/glm/base.rb', line 3

def initialize(x, y, alpha = 0.05)
  @x = x
  @y = y
  @@alpha = alpha
  @theta = GSL::Vector.alloc(Array.new(x.size2, @@initial_weight))
end
Class Method Details
.g(eta) ⇒ Object
Canonical response function, intended to be overridden
# File 'lib/glm/base.rb', line 59

def self.g(eta)
  raise 'Canonical response function g(eta) undefined'
end
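As a hedged illustration of what overriding might look like (this example is not part of the library; the class name LogisticResponse and its standalone setup are assumptions), the canonical response for a logistic model is the sigmoid:

```ruby
# Hypothetical sketch: a logistic-style override of the canonical
# response function g(eta). The sigmoid maps the natural parameter
# eta to a mean in (0, 1).
class LogisticResponse
  def self.g(eta)
    1.0 / (1.0 + Math.exp(-eta))
  end
end

puts LogisticResponse.g(0.0)  # sigmoid(0) = 0.5
```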
Instance Method Details
#a ⇒ Object
Log partition function a(eta), intended to be overridden
# File 'lib/glm/base.rb', line 11

def a
  raise 'Log partition function a(eta) undefined'
end
#b ⇒ Object
Intended to be overridden
# File 'lib/glm/base.rb', line 16

def b
  raise 'b undefined'
end
#est(x) ⇒ Object
Estimator
Arguments:
x: a feature vector in Array
Returns:
Estimation
# File 'lib/glm/base.rb', line 36

def est(x)
  format(x)
end
#eta(x) ⇒ Object
Natural parameter eta
# File 'lib/glm/base.rb', line 47

def eta(x)
  tmp = @theta * x.transpose
  return tmp
end
#format(x) ⇒ Object
# File 'lib/glm/base.rb', line 20

def format(x)
  if x.is_a? GSL::Vector
    return output(x)
  elsif x.is_a? GSL::Matrix
    tmp = GSL::Vector.alloc x.size1
    (0...x.size1).each { |i| tmp[i] = output(x.row(i)) }
    return tmp
  end
end
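The dispatch above can be sketched in plain Ruby without the GSL dependency (an illustrative stand-in only: Arrays replace GSL::Vector/GSL::Matrix, and output_like is a hypothetical placeholder for output):

```ruby
# Sketch of format(x)'s dispatch: a single feature vector yields one
# estimate; a batch (array of vectors) yields one estimate per row.
def format_like(x)
  if x.first.is_a?(Numeric)      # one feature vector
    output_like(x)
  else                           # batch: one estimate per row
    x.map { |row| output_like(row) }
  end
end

def output_like(row)
  row.sum  # placeholder for the real hypothesis h(x)
end

p format_like([1, 2, 3])         # 6
p format_like([[1, 2], [3, 4]])  # [3, 7]
```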
#gradient(x, y, v) ⇒ Object
Gradient on one sample
# File 'lib/glm/base.rb', line 64

def gradient(x, y, v)
  tmp = h(v)
  res = (y - tmp) * x
  return res
end
#h(x) ⇒ Object
Hypothesis function, outputs E(y|theta, x), mean of y given x parameterized by theta
Parameters:
x: a feature vector
Returns:
E(y|theta, x)
# File 'lib/glm/base.rb', line 75

def h(x)
  tmp = eta(x)
  return self.class.g(tmp)
end
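To make the composition h(x) = g(eta(x)) concrete, here is a GSL-free sketch (all names are illustrative; the sigmoid stands in for a subclass's g, and dot computes eta = theta . x):

```ruby
# h(x) = g(theta . x): the natural parameter eta is a dot product,
# and the canonical response g turns it into the conditional mean.
def dot(a, b)
  a.zip(b).sum { |ai, bi| ai * bi }
end

def sigmoid(eta)
  1.0 / (1.0 + Math.exp(-eta))
end

theta = [0.5, -0.25]
x = [1.0, 2.0]
h = sigmoid(dot(theta, x))  # eta = 0.5 - 0.5 = 0.0
puts h                      # 0.5
```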
#output(x) ⇒ Object
Output estimation from E(y|theta, x). Needs overriding, except for plain linear regression
# File 'lib/glm/base.rb', line 42

def output(x)
  return h(x)
end
#single_update ⇒ Object
A step based on one sample in stochastic gradient descent
# File 'lib/glm/base.rb', line 81

def single_update()
end
#sto_update ⇒ Object
One complete loop of stochastic gradient descent
# File 'lib/glm/base.rb', line 86

def sto_update()
  (0...(@x.size1)).each do |i|
    (0...(@x.size2)).each do |j|
      updates = gradient(@x[i, j], @y[i], @x.row(i))
      @theta[j] = @theta[j] + @@alpha * updates
    end
  end
  pp @theta
end
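The update rule above — each theta[j] moves by alpha * (y[i] - h(x_i)) * x[i][j] per sample — can be sketched in plain Ruby for the linear-regression case, where g is the identity so h(x) is just theta . x (a standalone illustration, not the library's API; sgd_pass and the toy data are assumptions):

```ruby
# One stochastic-gradient pass over the data, mirroring sto_update
# without GSL. Linear regression: h(x) = theta . x.
def sgd_pass(xs, ys, theta, alpha)
  xs.each_with_index do |row, i|
    h = row.zip(theta).sum { |xj, tj| xj * tj }   # hypothesis theta . x
    err = ys[i] - h                               # residual y - h(x)
    theta = theta.each_with_index.map { |tj, j| tj + alpha * err * row[j] }
  end
  theta
end

xs = [[1.0, 1.0], [1.0, 2.0], [1.0, 3.0]]  # bias term plus one feature
ys = [2.0, 3.0, 4.0]                        # generated by y = 1 + x
theta = [0.0, 0.0]
500.times { theta = sgd_pass(xs, ys, theta, 0.1) }
p theta  # converges toward [1.0, 1.0]
```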
#T ⇒ Object
Sufficient statistic T
# File 'lib/glm/base.rb', line 54

def T
  return @y
end
#theta ⇒ Object
# File 'lib/glm/base.rb', line 96

def theta()
  return @theta
end