Class: Neuronet::Neuron
- Inherits: Object
- Defined in:
- lib/neuronet/neuron.rb
Overview
A Neuron is capable of creating connections to other neurons. The connections attribute is a list of the neuron’s connections to other neurons. A neuron’s bias is its kicker (or deduction) to its activation value, a sum of its connections’ values.
Class Attribute Summary collapse
-
.label ⇒ Object
Returns the value of attribute label.
Instance Attribute Summary collapse
-
#activation ⇒ Object
readonly
Returns the value of attribute activation.
-
#bias ⇒ Object
Returns the value of attribute bias.
-
#connections ⇒ Object
readonly
Returns the value of attribute connections.
-
#label ⇒ Object
readonly
Returns the value of attribute label.
Class Method Summary collapse
-
.mju(neuron) ⇒ Object
Full recursive implementation of mju.
Instance Method Summary collapse
-
#backpropagate(error) ⇒ Object
The backpropagate method modifies the neuron’s bias in proportion to the given error and passes on this error to each of its connection’s backpropagate method.
-
#connect(neuron = Neuron.new, weight: 0.0) ⇒ Object
Connects the neuron to another neuron.
-
#derivative ⇒ Object
𝓓𝒗⌈𝒗 = (1-⌈𝒗)⌈𝒗 = (1-𝒂)𝒂 = 𝓑𝒂.
-
#initialize(value = 0.0, bias: 0.0, connections: []) ⇒ Neuron
constructor
The initialize method sets the neuron’s value, bias and connections.
-
#inspect ⇒ Object
Tacks on to neuron’s inspect method to show the neuron’s bias and connections.
-
#iota ⇒ Object
𝜾 := 𝜧 𝜧‘ 𝝁“ = 𝜧 𝜿’.
-
#kappa ⇒ Object
𝜿 := 𝜧 𝝁‘ = 𝑾 𝓑𝒂’𝝁‘ = 𝑾 𝝀’ def kappa = mju(&:mu).
-
#lamda ⇒ Object
𝝀 = 𝓑𝒂𝛍.
-
#mju(&block) ⇒ Object
Reference the library’s wiki: 𝒆ₕ ~ 𝜀(𝝁ₕ + 𝜧ₕⁱ𝝁ᵢ + 𝜧ₕⁱ𝜧ᵢʲ𝝁ⱼ + 𝜧ₕⁱ𝜧ᵢʲ𝜧ⱼᵏ𝝁ₖ + …) 𝜧ₕⁱ𝝁ᵢ is: neuron.mju{ |connected_neuron| connected_neuron.mu } 𝜧ₕⁱ𝜧ᵢʲ𝝁ⱼ is: nh.mju{ |ni| ni.mju{ |nj| nj.mu }}.
-
#mu ⇒ Object
The neuron’s mu is the sum of the connections’ mu(activation), plus one for the bias: 𝛍 := 1+∑𝐚‘.
-
#partial ⇒ Object
For when connections are already updated, Neuron#partial updates the activation with the current values of bias and connections.
-
#to_s ⇒ Object
A neuron plainly puts itself as its label.
-
#update ⇒ Object
Updates the activation with the current value of bias and updated values of connections.
-
#value ⇒ Object
The “real world” value of the neuron is the unsquashed activation value.
-
#value=(value) ⇒ Object
One can explicitly set the neuron’s value, typically used to set the input neurons.
Constructor Details
#initialize(value = 0.0, bias: 0.0, connections: []) ⇒ Neuron
The initialize method sets the neuron’s value, bias and connections.
# File 'lib/neuronet/neuron.rb', line 72

def initialize(value = 0.0, bias: 0.0, connections: [])
  self.value = value
  @connections = connections
  @bias = bias
  @label = Neuron.label
  Neuron.label = Neuron.label.next
end
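The auto-labeling above relies on Ruby’s String#next: each new neuron takes the current class-level label and then the label is bumped. A minimal standalone sketch of that scheme (not using the gem):

```ruby
# Each "neuron" takes the current label, then the label is bumped
# with String#next, mirroring Neuron.label = Neuron.label.next.
label = 'a'
labels = 3.times.map do
  current = label
  label = label.next # 'a' -> 'b' -> 'c'
  current
end
labels # => ["a", "b", "c"]
```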
Class Attribute Details
.label ⇒ Object
Returns the value of attribute label.
# File 'lib/neuronet/neuron.rb', line 12

def label
  @label
end
Instance Attribute Details
#activation ⇒ Object (readonly)
Returns the value of attribute activation.
# File 'lib/neuronet/neuron.rb', line 15

def activation
  @activation
end
#bias ⇒ Object
Returns the value of attribute bias.
# File 'lib/neuronet/neuron.rb', line 16

def bias
  @bias
end
#connections ⇒ Object (readonly)
Returns the value of attribute connections.
# File 'lib/neuronet/neuron.rb', line 15

def connections
  @connections
end
#label ⇒ Object (readonly)
Returns the value of attribute label.
# File 'lib/neuronet/neuron.rb', line 15

def label
  @label
end
Class Method Details
.mju(neuron) ⇒ Object
Full recursive implementation of mju:
# File 'lib/neuronet/neuron.rb', line 38

def self.mju(neuron)
  return 0.0 if neuron.connections.empty?

  neuron.mu + neuron.mju { |connected_neuron| Neuron.mju(connected_neuron) }
end
Instance Method Details
#backpropagate(error) ⇒ Object
The backpropagate method modifies the neuron’s bias in proportion to the given error and passes on this error to each of its connections’ backpropagate method. While updates flow from input to output, backpropagation of errors flows from output to input.
# File 'lib/neuronet/neuron.rb', line 107

def backpropagate(error)
  return self if @connections.empty?

  @bias += Neuronet.noise[error]
  if @bias.abs > Neuronet.maxb
    @bias = @bias.positive? ? Neuronet.maxb : -Neuronet.maxb
  end
  @connections.each { |connection| connection.backpropagate(error) }
  self
end
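The clamping step in backpropagate keeps the bias within ±Neuronet.maxb. A hedged sketch of just that step, with an illustrative bound and the noise factored out (both `maxb` and `clamp_bias` are assumptions for illustration, not the gem's actual values or API):

```ruby
maxb = 18.0 # illustrative bound; the gem's Neuronet.maxb may differ

# Nudge the bias by the error, then clamp it to the interval [-maxb, maxb].
clamp_bias = lambda do |bias, error|
  bias += error
  if bias.abs > maxb
    bias = bias.positive? ? maxb : -maxb
  end
  bias
end

clamp_bias.call(0.5, 0.25) # => 0.75 (within bounds, passes through)
clamp_bias.call(17.5, 1.0) # => 18.0 (clamped at +maxb)
```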
#connect(neuron = Neuron.new, weight: 0.0) ⇒ Object
Connects the neuron to another neuron. The default weight of 0.0 means there is no initial association. The connect method is how the implementation adds a connection, that is, the way to connect one neuron to another. To connect “output” to “input”, for example:
input = Neuronet::Neuron.new
output = Neuronet::Neuron.new
output.connect(input)
Think “output” connects to “input”.
# File 'lib/neuronet/neuron.rb', line 126

def connect(neuron = Neuron.new, weight: 0.0)
  @connections.push(Connection.new(neuron, weight:))
  # Note that we're returning the connected neuron:
  neuron
end
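Because connect returns the connected neuron rather than self, connections can be chained from output back toward input. A toy illustration with a stand-in Struct (not the gem's classes):

```ruby
# Stand-in for a neuron: a name and a list of source "connections".
Toy = Struct.new(:name, :sources) do
  def connect(other)
    sources << other
    other # returning the connected node is what enables chaining
  end
end

output = Toy.new('out', [])
hidden = Toy.new('hid', [])
input  = Toy.new('in',  [])

# Reads like a path from output back to input:
output.connect(hidden).connect(input)
```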
#derivative ⇒ Object
𝓓𝒗⌈𝒗 = (1-⌈𝒗)⌈𝒗 = (1-𝒂)𝒂 = 𝓑𝒂
# File 'lib/neuronet/neuron.rb', line 45

def derivative = Neuronet.derivative[@activation]
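The identity above says the derivative can be computed from the activation alone. A quick numeric sanity check, assuming the conventional logistic squash (the gem's actual Neuronet.squash is not shown on this page):

```ruby
# Logistic squash assumed for illustration.
squash = ->(v) { 1.0 / (1.0 + Math.exp(-v)) }

v = 0.7
a = squash.call(v)

# Central-difference estimate of d/dv squash(v)...
h = 1e-6
numeric = (squash.call(v + h) - squash.call(v - h)) / (2 * h)

# ...agrees with the closed form (1 - a) * a from the activation alone.
analytic = (1 - a) * a
```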
#inspect ⇒ Object
Tacks on to neuron’s inspect method to show the neuron’s bias and connections.
# File 'lib/neuronet/neuron.rb', line 134

def inspect
  fmt = Neuronet.format
  if @connections.empty?
    "#{@label}:#{fmt % value}"
  else
    "#{@label}:#{fmt % value}|#{[(fmt % @bias), *@connections].join('+')}"
  end
end
#iota ⇒ Object
𝜾 := 𝜧 𝜧‘ 𝝁“ = 𝜧 𝜿’
# File 'lib/neuronet/neuron.rb', line 55

def iota = mju(&:kappa)
#kappa ⇒ Object
𝜿 := 𝜧 𝝁‘ = 𝑾 𝓑𝒂’𝝁‘ = 𝑾 𝝀’ def kappa = mju(&:mu)
# File 'lib/neuronet/neuron.rb', line 52

def kappa = @connections.sum(&:kappa)
#lamda ⇒ Object
𝝀 = 𝓑𝒂𝛍
# File 'lib/neuronet/neuron.rb', line 48

def lamda = derivative * mu
#mju(&block) ⇒ Object
Reference the library’s wiki:
𝒆ₕ ~ 𝜀(𝝁ₕ + 𝜧ₕⁱ𝝁ᵢ + 𝜧ₕⁱ𝜧ᵢʲ𝝁ⱼ + 𝜧ₕⁱ𝜧ᵢʲ𝜧ⱼᵏ𝝁ₖ + ...)
𝜧ₕⁱ𝝁ᵢ is:
neuron.mju{ |connected_neuron| connected_neuron.mu }
𝜧ₕⁱ𝜧ᵢʲ𝝁ⱼ is:
nh.mju{ |ni| ni.mju{ |nj| nj.mu }}
# File 'lib/neuronet/neuron.rb', line 33

def mju(&block)
  @connections.sum { _1.mju * block[_1.neuron] }
end
#mu ⇒ Object
The neuron’s mu is the sum of the connections’ mu(activation), plus one for the bias:
𝛍 := 1+∑𝐚'
# File 'lib/neuronet/neuron.rb', line 21

def mu
  return 0.0 if @connections.empty?

  1 + @connections.sum(&:mu)
end
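Under the definition 𝛍 := 1 + ∑𝐚‘, a terminal (input) neuron contributes 0.0, while any other neuron counts 1 for its bias plus its connected neurons' activations. A toy arithmetic sketch (the activation values are illustrative, not from the gem):

```ruby
# Activations of the connected neurons (illustrative values).
activations = [0.5, 0.75]

# 1 for the bias plus the connected activations;
# an unconnected (input) neuron has a mu of 0.0.
mu = activations.empty? ? 0.0 : 1 + activations.sum
mu # => 2.25
```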
#partial ⇒ Object
For when connections are already updated, Neuron#partial updates the activation with the current values of bias and connections. It is not always necessary to burrow all the way down to the terminal input neurons to update the current neuron if its connected neurons have all been updated. The implementation should set its algorithm to use partial instead of update, as update will most likely needlessly recompute previously updated neurons.
# File 'lib/neuronet/neuron.rb', line 96

def partial
  return @activation if @connections.empty?

  self.value = @bias + @connections.sum(&:partial)
  @activation
end
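The difference between partial and update can be seen with a toy stand-in (not the gem's classes, and with the bias and squashing omitted): update recurses to the inputs, while partial trusts the connected neurons' current activations.

```ruby
Node = Struct.new(:activation, :sources) do
  # Full recomputation: recurse all the way down to the inputs.
  def update
    return activation if sources.empty?
    self.activation = sources.sum(&:update)
  end

  # Partial recomputation: trust the sources' current activations.
  def partial
    return activation if sources.empty?
    self.activation = sources.sum(&:activation)
  end
end

leaf = Node.new(1.0, [])
mid  = Node.new(0.0, [leaf])
top  = Node.new(0.0, [mid])

top.update           # recomputes mid from leaf, so top becomes 1.0
mid.activation = 2.0 # suppose mid was already refreshed elsewhere
top.partial          # picks up mid's current value without recursing
```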
#to_s ⇒ Object
A neuron plainly puts itself as its label.
# File 'lib/neuronet/neuron.rb', line 144

def to_s = @label
#update ⇒ Object
Updates the activation with the current value of bias and updated values of connections.
# File 'lib/neuronet/neuron.rb', line 82

def update
  return @activation if @connections.empty?

  self.value = @bias + @connections.sum(&:update)
  @activation
end
#value ⇒ Object
The “real world” value of the neuron is the unsquashed activation value.
# File 'lib/neuronet/neuron.rb', line 69

def value = Neuronet.unsquash[@activation]
#value=(value) ⇒ Object
One can explicitly set the neuron’s value, typically used to set the input neurons. The given “real world” value is squashed into the neuron’s activation value.
# File 'lib/neuronet/neuron.rb', line 60

def value=(value)
  # If value is out of bounds, set it to the bound.
  if value.abs > Neuronet.maxv
    value = value.positive? ? Neuronet.maxv : -Neuronet.maxv
  end
  @activation = Neuronet.squash[value]
end
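value= squashes a real-world value into an activation, and #value unsquashes it back. A minimal sketch of such a squash/unsquash pair, assuming the conventional logistic function and its inverse, the logit (the gem's actual Neuronet.squash and Neuronet.unsquash lambdas are not shown on this page):

```ruby
# Logistic squash and its inverse (the logit), assumed for illustration.
squash   = ->(v) { 1.0 / (1.0 + Math.exp(-v)) }
unsquash = ->(a) { Math.log(a / (1.0 - a)) }

activation = squash.call(0.0) # a zero value squashes to mid-range 0.5
value      = unsquash.call(activation) # round-trips back to 0.0
```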