Class: BackProp::Value

Inherits: Object
Defined in: lib/backprop.rb
Instance Attribute Summary

- #backstep ⇒ Object
  Returns the value of attribute backstep.
- #children ⇒ Object (readonly)
  Returns the value of attribute children.
- #gradient ⇒ Object
  Returns the value of attribute gradient.
- #label ⇒ Object
  Returns the value of attribute label.
- #op ⇒ Object
  Returns the value of attribute op.
- #value ⇒ Object
  Returns the value of attribute value.

Class Method Summary

- .wrap(other) ⇒ Object

Instance Method Summary

- #*(other) ⇒ Object
- #**(other) ⇒ Object
  Mostly we are squaring (other = 2) or dividing (other = -1). We don't support expressions, so a Value is not supported for other. This will look like a unary op in the tree.
- #+(other) ⇒ Object
  Primary operations; every Value.new(op:) also defines a backstep. The backstep closes over the environment of the method so it can refer to values present when the method executes.
- #-(other) ⇒ Object
  Secondary operations, defined in terms of the primary ones. These return differentiable Values, but with more steps.
- #/(other) ⇒ Object
- #backprop ⇒ Object
  Recursive; visits all descendants and updates their gradients via backstep.
- #backward ⇒ Object
  Generally called on the final output, say of a loss function. Initializes the gradients, then updates the gradients on all dependent Values via backpropagation.
- #descend(step_size = 0.1) ⇒ Object
- #descend_recursive(step_size = 0.1) ⇒ Object
- #display ⇒ Object
- #exp ⇒ Object
  e^x; a unary operation.
- #initialize(float, label: '', op: nil, children: []) ⇒ Value (constructor)
  A new instance of Value.
- #inspect ⇒ Object
- #relu ⇒ Object
  Rectified linear unit; not susceptible to vanishing gradients, unlike the activations above.
- #reset_gradient ⇒ Object
  Recursive; visits all descendants and sets each gradient to zero.
- #sigmoid ⇒ Object
  1 / (1 + e^-x).
- #tanh ⇒ Object
  Activation functions; unary operations.
- #to_s ⇒ Object
Constructor Details
#initialize(float, label: '', op: nil, children: []) ⇒ Value
Returns a new instance of Value.
# File 'lib/backprop.rb', line 10

def initialize(float, label: '', op: nil, children: [])
  @value = float.to_f
  @gradient = 0
  @children = children
  if @children.empty?
    raise "op #{op.inspect} has no children" unless op.nil?
  else
    raise "op is required" if op.nil?
  end
  @op = op
  @label = label
  @backstep = -> {}
end
Instance Attribute Details
#backstep ⇒ Object
Returns the value of attribute backstep.
# File 'lib/backprop.rb', line 8

def backstep
  @backstep
end
#children ⇒ Object (readonly)
Returns the value of attribute children.
# File 'lib/backprop.rb', line 7

def children
  @children
end
#gradient ⇒ Object
Returns the value of attribute gradient.
# File 'lib/backprop.rb', line 8

def gradient
  @gradient
end
#label ⇒ Object
Returns the value of attribute label.
# File 'lib/backprop.rb', line 8

def label
  @label
end
#op ⇒ Object
Returns the value of attribute op.
# File 'lib/backprop.rb', line 8

def op
  @op
end
#value ⇒ Object
Returns the value of attribute value.
# File 'lib/backprop.rb', line 8

def value
  @value
end
Class Method Details
.wrap(other) ⇒ Object
# File 'lib/backprop.rb', line 3

def self.wrap(other)
  other.is_a?(Value) ? other : Value.new(other)
end
Instance Method Details
#*(other) ⇒ Object
# File 'lib/backprop.rb', line 66

def *(other)
  other = Value.wrap(other)
  val = Value.new(@value * other.value, children: [self, other], op: :*)
  val.backstep = -> {
    # derivative of multiplication is the opposite term
    self.gradient += val.gradient * other.value
    other.gradient += val.gradient * self.value
  }
  val
end
#**(other) ⇒ Object
Mostly we are squaring (other = 2) or dividing (other = -1). We don't support expressions, so a Value is not supported for other. This will look like a unary op in the tree.
# File 'lib/backprop.rb', line 80

def **(other)
  raise("Value is not supported") if other.is_a? Value
  val = Value.new(@value ** other, children: [self], op: :**)
  val.backstep = -> {
    # accumulate, chain rule, derivative; as before
    self.gradient += val.gradient * (other * self.value ** (other - 1))
  }
  val
end
#+(other) ⇒ Object
Primary operations; every Value.new(op:) also defines a backstep. The backstep closes over the environment of the method so it can refer to values present when the method executes.
# File 'lib/backprop.rb', line 46

def +(other)
  other = Value.wrap(other)
  val = Value.new(@value + other.value, children: [self, other], op: :+)

  # What we're about to do here is pretty twisted. We're going to refer
  # to this execution context in the definition of a lambda, but we'll
  # evaluate it later.

  # Backstep is a lambda attached to val, which will be the return value
  # here. When val.backstep is called later, it will update the gradients
  # on both self and other.
  val.backstep = -> {
    # gradients accumulate for handling a term used more than once
    # chain rule says to multiply val's gradient and the op's derivative
    # derivative of addition is 1.0; pass val's gradient to children
    self.gradient += val.gradient
    other.gradient += val.gradient
  }
  val
end
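The closure mechanics described above can be sketched without the library. Here Node is a hypothetical stand-in for Value: the lambda captures its operands at build time and pushes gradients down to them only when called later.

```ruby
# Standalone sketch of the backstep pattern; Node is illustrative,
# not part of the library.
Node = Struct.new(:value, :gradient, :backstep)

a = Node.new(2.0, 0.0, -> {})
b = Node.new(3.0, 0.0, -> {})
sum = Node.new(a.value + b.value, 0.0, nil)

# The lambda closes over a, b, and sum; nothing runs until .call
sum.backstep = -> {
  a.gradient += sum.gradient  # derivative of + is 1.0 for each child
  b.gradient += sum.gradient
}

sum.gradient = 1.0  # seed the output gradient, as backward does
sum.backstep.call
# a.gradient and b.gradient are now 1.0
```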
#-(other) ⇒ Object
Secondary operations, defined in terms of the primary ones. These return differentiable Values, but with more steps.
# File 'lib/backprop.rb', line 104

def -(other)
  self + (Value.wrap(other) * Value.new(-1))
end
#/(other) ⇒ Object
# File 'lib/backprop.rb', line 108

def /(other)
  self * (Value.wrap(other) ** -1)
end
#backprop ⇒ Object
Recursive; visits all descendants and updates their gradients via backstep.
# File 'lib/backprop.rb', line 161

def backprop
  self.backstep.call
  @children.each(&:backprop)
  self
end
#backward ⇒ Object
Generally called on the final output, say of a loss function. Initializes the gradients, then updates the gradients on all dependent Values via backpropagation.
# File 'lib/backprop.rb', line 147

def backward
  self.reset_gradient  # set gradient to zero on all descendants
  @gradient = 1.0      # this node's gradient is 1.0
  self.backprop        # call backstep on all descendants
end
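What backward computes can be checked by hand. A plain-Ruby sketch (independent of this class; all names are illustrative) compares the chain-rule gradient of a small loss with a finite-difference estimate:

```ruby
# For loss = (x * w + b)**2, the chain rule gives dloss/dw = 2 * (x*w + b) * x,
# which is what backward would accumulate into w's gradient.
x, w, b = 1.5, -0.5, 2.0
loss = ->(wv) { (x * wv + b)**2 }

analytic = 2 * (x * w + b) * x
h = 1e-6
# central finite difference as an independent estimate of dloss/dw
numeric = (loss.call(w + h) - loss.call(w - h)) / (2 * h)
raise "gradient check failed" if (analytic - numeric).abs > 1e-5
```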
#descend(step_size = 0.1) ⇒ Object
# File 'lib/backprop.rb', line 167

def descend(step_size = 0.1)
  @value += -1 * step_size * @gradient
end
#descend_recursive(step_size = 0.1) ⇒ Object
# File 'lib/backprop.rb', line 171

def descend_recursive(step_size = 0.1)
  self.descend(step_size)
  @children.each { |c| c.descend_recursive(step_size) }
  self
end
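The descend step can be illustrated standalone: the same update rule, value += -1 * step_size * gradient, drives f(x) = (x - 3)^2 toward its minimum. All names here are illustrative, not part of the library.

```ruby
# Minimize f(x) = (x - 3)**2 by repeatedly stepping against the gradient,
# mirroring what descend does with @value and @gradient.
x = 0.0
step_size = 0.1
100.times do
  gradient = 2 * (x - 3)  # df/dx at the current x
  x += -1 * step_size * gradient
end
# x is now very close to 3.0, the minimizer
```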
#display ⇒ Object
# File 'lib/backprop.rb', line 28

def display
  format("%s(%.3f gradient=%.3f",
         @label.empty? ? @op || 'Value' : @label, @value, @gradient) +
    (@op.nil? ? '' : format(" %s(%s)", @op, @children.join(', '))) + ')'
end
#exp ⇒ Object
e^x - unary operation
# File 'lib/backprop.rb', line 91

def exp
  val = Value.new(Math.exp(@value), children: [self], op: :exp)
  val.backstep = -> {
    self.gradient += val.gradient * val.value
  }
  val
end
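The reason the backstep multiplies by val.value: the derivative of e^x is e^x itself, so the node's local derivative is exactly its own forward value. A quick plain-Ruby check, independent of the library:

```ruby
# d/dx e**x = e**x, verified against a central finite difference.
x = 1.3
h = 1e-6
numeric = (Math.exp(x + h) - Math.exp(x - h)) / (2 * h)
raise "exp derivative mismatch" if (numeric - Math.exp(x)).abs > 1e-5
```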
#inspect ⇒ Object
# File 'lib/backprop.rb', line 35

def inspect
  @children.empty? ? self.display :
    [self.display, @children.map(&:inspect).join("\n\t")].join("\n\t")
end
#relu ⇒ Object
Rectified linear unit; not susceptible to vanishing gradients, unlike the activations above.
# File 'lib/backprop.rb', line 131

def relu
  neg = @value < 0
  val = Value.new(neg ? 0 : @value, children: [self], op: :relu)
  val.backstep = -> {
    self.gradient += val.gradient * (neg ? 0 : 1)
  }
  val
end
#reset_gradient ⇒ Object
Recursive; visits all descendants and sets each gradient to zero.
# File 'lib/backprop.rb', line 154

def reset_gradient
  @gradient = 0.0
  @children.each(&:reset_gradient)
  self
end
#sigmoid ⇒ Object
1 / (1 + e^-x).
# File 'lib/backprop.rb', line 126

def sigmoid
  ((self * -1).exp + 1) ** -1
end
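sigmoid is composed from the differentiable primitives (*, exp, +, **) rather than computed directly, so gradients flow through each step. A standalone check (helper names are illustrative) that the composition matches the closed form 1 / (1 + e^-x):

```ruby
# Same shape as the method above, but on plain floats.
def sigmoid_composed(x)
  (Math.exp(x * -1) + 1)**-1
end

# The closed-form definition.
def sigmoid_direct(x)
  1.0 / (1.0 + Math.exp(-x))
end

[-2.0, 0.0, 0.5, 3.0].each do |x|
  raise "mismatch at #{x}" if (sigmoid_composed(x) - sigmoid_direct(x)).abs > 1e-12
end
```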
#tanh ⇒ Object
Activation functions; unary operations.
# File 'lib/backprop.rb', line 117

def tanh
  val = Value.new(Math.tanh(@value), children: [self], op: :tanh)
  val.backstep = -> {
    self.gradient += val.gradient * (1 - val.value ** 2)
  }
  val
end
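The backstep uses the identity d/dx tanh(x) = 1 - tanh(x)^2, which is why it multiplies by (1 - val.value ** 2). A quick plain-Ruby check against a finite difference:

```ruby
# Verify the tanh derivative identity at a sample point.
x = 0.7
h = 1e-6
numeric = (Math.tanh(x + h) - Math.tanh(x - h)) / (2 * h)
analytic = 1 - Math.tanh(x)**2
raise "tanh derivative mismatch" if (numeric - analytic).abs > 1e-8
```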
#to_s ⇒ Object
# File 'lib/backprop.rb', line 24

def to_s
  @label.empty? ? ("%.3f" % @value) : format("%s=%.3f", @label, @value)
end