Class: Neuronet::Neuron

Inherits: Object
Defined in:
lib/neuronet/neuron.rb

Overview

A Neuron is capable of creating connections to other neurons. The connections attribute is a list of the neuron’s connections to other neurons. A neuron’s bias is its kicker (or deduction) to its activation value, the sum of its connections’ values.

Class Attribute Summary

Instance Attribute Summary

Class Method Summary

Instance Method Summary

Constructor Details

#initialize(value = 0.0, bias: 0.0, connections: []) ⇒ Neuron

The initialize method sets the neuron’s value, bias and connections.



# File 'lib/neuronet/neuron.rb', line 72

def initialize(value = 0.0, bias: 0.0, connections: [])
  self.value   = value
  @connections = connections
  @bias        = bias
  @label       = Neuron.label
  Neuron.label = Neuron.label.next
end
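The constructor also assigns each neuron an auto-incrementing label via String#next. A standalone sketch of that mechanism (the starting label 'a' is an assumption for illustration, not the library's documented default):

```ruby
# Sketch of the label auto-increment performed by the constructor.
label = 'a'
labels = 3.times.map do
  current = label     # this neuron takes the current label
  label = label.next  # the class counter advances: 'a' -> 'b' -> 'c'
  current
end
puts labels.inspect  # ["a", "b", "c"]
```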

Class Attribute Details

.label ⇒ Object

Returns the value of attribute label.



# File 'lib/neuronet/neuron.rb', line 12

def label
  @label
end

Instance Attribute Details

#activation ⇒ Object (readonly)

Returns the value of attribute activation.



# File 'lib/neuronet/neuron.rb', line 15

def activation
  @activation
end

#bias ⇒ Object

Returns the value of attribute bias.



# File 'lib/neuronet/neuron.rb', line 16

def bias
  @bias
end

#connections ⇒ Object (readonly)

Returns the value of attribute connections.



# File 'lib/neuronet/neuron.rb', line 15

def connections
  @connections
end

#label ⇒ Object (readonly)

Returns the value of attribute label.



# File 'lib/neuronet/neuron.rb', line 15

def label
  @label
end

Class Method Details

.mju(neuron) ⇒ Object

Full recursive implementation of mju:



# File 'lib/neuronet/neuron.rb', line 38

def self.mju(neuron)
  return 0.0 if neuron.connections.empty?

  neuron.mu + neuron.mju { |connected_neuron| Neuron.mju(connected_neuron) }
end

Instance Method Details

#backpropagate(error) ⇒ Object

The backpropagate method modifies the neuron’s bias in proportion to the given error and passes this error on to each connection’s backpropagate method. While updates flow from input to output, backpropagation of errors flows from output to input.



# File 'lib/neuronet/neuron.rb', line 107

def backpropagate(error)
  return self if @connections.empty?

  @bias += Neuronet.noise[error]
  if @bias.abs > Neuronet.maxb
    @bias = @bias.positive? ? Neuronet.maxb : -Neuronet.maxb
  end
  @connections.each { |connection| connection.backpropagate(error) }
  self
end
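The bias update and clamping above can be sketched in isolation. This is a hedged sketch: MAXB stands in for Neuronet.maxb (which is configurable), and the noise factor is taken as a pass-through, whereas the real code adds Neuronet.noise[error]:

```ruby
# Standalone sketch of the bias adjustment with clamping.
MAXB = 18.0  # stand-in for Neuronet.maxb; illustrative only

adjust_bias = lambda do |bias, error|
  bias += error  # the real code adds Neuronet.noise[error]
  # Clamp the bias to [-MAXB, MAXB]:
  bias.abs > MAXB ? (bias.positive? ? MAXB : -MAXB) : bias
end

puts adjust_bias.call(0.5, 0.25)  # 0.75
puts adjust_bias.call(17.9, 5.0)  # 18.0 (clamped)
```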

#connect(neuron = Neuron.new, weight: 0.0) ⇒ Object

Connects the neuron to another neuron. The default weight of 0.0 means there is no initial association. The connect method is how the implementation adds a connection; it is the way to connect one neuron to another. To connect “output” to “input”, for example:

input = Neuronet::Neuron.new
output = Neuronet::Neuron.new
output.connect(input)

Think “output” connects to “input”.



# File 'lib/neuronet/neuron.rb', line 126

def connect(neuron = Neuron.new, weight: 0.0)
  @connections.push(Connection.new(neuron, weight:))
  # Note that we're returning the connected neuron:
  neuron
end

#derivative ⇒ Object

𝓓𝒗⌈𝒗 = (1-⌈𝒗)⌈𝒗 = (1-𝒂)𝒂 = 𝓑𝒂



# File 'lib/neuronet/neuron.rb', line 45

def derivative = Neuronet.derivative[@activation]
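Assuming the standard logistic squash (a sketch; the library’s squash function is configurable), the identity that the derivative at activation 𝒂 equals (1-𝒂)𝒂 can be checked numerically against a finite difference:

```ruby
# Numeric check: the derivative of the logistic squash at activation a
# equals (1 - a) * a.
squash = ->(v) { 1.0 / (1.0 + Math.exp(-v)) }

v = 0.7
a = squash.call(v)
analytic = (1.0 - a) * a
numeric  = (squash.call(v + 1e-6) - squash.call(v - 1e-6)) / 2e-6

ok = (analytic - numeric).abs < 1e-6
puts ok  # true
```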

#inspectObject

Tacks on to the neuron’s inspect method to show the neuron’s bias and connections.



# File 'lib/neuronet/neuron.rb', line 134

def inspect
  fmt = Neuronet.format
  if @connections.empty?
    "#{@label}:#{fmt % value}"
  else
    "#{@label}:#{fmt % value}|#{[(fmt % @bias), *@connections].join('+')}"
  end
end

#iota ⇒ Object

𝜾 := 𝜧 𝜧′ 𝝁″ = 𝜧 𝜿′



# File 'lib/neuronet/neuron.rb', line 55

def iota = mju(&:kappa)

#kappa ⇒ Object

𝜿 := 𝜧 𝝁′ = 𝑾 𝓑𝒂′𝝁′ = 𝑾 𝝀′



# File 'lib/neuronet/neuron.rb', line 52

def kappa = @connections.sum(&:kappa)

#lamda ⇒ Object

𝝀 = 𝓑𝒂𝛍



# File 'lib/neuronet/neuron.rb', line 48

def lamda = derivative * mu

#mju(&block) ⇒ Object

Reference the library’s wiki:

𝒆ₕ ~ 𝜀(𝝁ₕ + 𝜧ₕⁱ𝝁ᵢ + 𝜧ₕⁱ𝜧ᵢʲ𝝁ⱼ + 𝜧ₕⁱ𝜧ᵢʲ𝜧ⱼᵏ𝝁ₖ + ...)

𝜧ₕⁱ𝝁ᵢ is:

neuron.mju{ |connected_neuron| connected_neuron.mu }

𝜧ₕⁱ𝜧ᵢʲ𝝁ⱼ is:

nh.mju{ |ni| ni.mju{ |nj| nj.mu }}


# File 'lib/neuronet/neuron.rb', line 33

def mju(&block)
  @connections.sum { _1.mju * block[_1.neuron] }
end
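The sum 𝜧ₕⁱ·block(nᵢ) over the connections can be sketched with plain data. Here pairs of [mju_factor, value] are hypothetical stand-ins for real Connection objects, not the library’s API:

```ruby
# Sketch of mju: each connection contributes its mju factor times the
# block's value on the connected neuron.
connections = [[0.5, 2.0], [0.25, 4.0]]

mju = ->(conns, &block) { conns.sum { |m, n| m * block.call(n) } }

# With the identity block, this is 0.5*2.0 + 0.25*4.0:
puts mju.call(connections) { |n| n }  # 2.0
```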

#mu ⇒ Object

The neuron’s mu is the sum of the connections’ mu (the connected neurons’ activations), plus one for the bias:

𝛍 := 1+∑𝐚'


# File 'lib/neuronet/neuron.rb', line 21

def mu
  return 0.0 if @connections.empty?

  1 + @connections.sum(&:mu)
end
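A worked example of 𝛍 := 1+∑𝐚' using plain Structs as stand-ins for the real Neuron class, under the assumption that a connection’s mu is the connected neuron’s activation:

```ruby
# Minimal stand-in: a neuron with an activation and a list of connected
# neurons; mu is 1 (for the bias) plus the sum of connected activations.
SNeuron = Struct.new(:activation, :connections) do
  def mu
    return 0.0 if connections.empty? # terminal (input) neuron
    1 + connections.sum(&:activation)
  end
end

a = SNeuron.new(0.5,  [])
b = SNeuron.new(0.75, [])
h = SNeuron.new(0.0,  [a, b])

puts h.mu  # 1 + 0.5 + 0.75 = 2.25
puts a.mu  # 0.0
```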

#partial ⇒ Object

For when connections are already updated, Neuron#partial updates the activation with the current values of bias and connections. It is not always necessary to burrow all the way down to the terminal input neurons to update the current neuron if its connected neurons have all been updated. The implementation should set its algorithm to use partial instead of update, as update will most likely needlessly update previously updated neurons.



# File 'lib/neuronet/neuron.rb', line 96

def partial
  return @activation if @connections.empty?

  self.value = @bias + @connections.sum(&:partial)
  @activation
end

#to_s ⇒ Object

A neuron plainly puts itself as its label.



# File 'lib/neuronet/neuron.rb', line 144

def to_s = @label

#update ⇒ Object

Updates the activation with the current value of bias and updated values of connections.



# File 'lib/neuronet/neuron.rb', line 82

def update
  return @activation if @connections.empty?

  self.value = @bias + @connections.sum(&:update)
  @activation
end
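A feed-forward sketch of update, assuming the logistic squash and representing each connection as a [weight, neuron] pair (stand-ins for illustration, not the library’s Connection class):

```ruby
# Each neuron recomputes its activation from its bias plus the weighted,
# freshly updated activations of its connected neurons.
Squash = ->(v) { 1.0 / (1.0 + Math.exp(-v)) }

UNeuron = Struct.new(:activation, :bias, :connections) do
  def update
    return activation if connections.empty? # inputs hold their value
    self.activation = Squash.call(bias + connections.sum { |w, n| w * n.update })
  end
end

input  = UNeuron.new(Squash.call(1.0), 0.0, [])  # input neuron, value 1.0
output = UNeuron.new(0.5, 0.1, [[2.0, input]])   # one connection, weight 2.0

result = output.update
puts result  # squash(0.1 + 2.0 * squash(1.0)), roughly 0.83
```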

#value ⇒ Object

The “real world” value of the neuron is the unsquashed activation value.



# File 'lib/neuronet/neuron.rb', line 69

def value = Neuronet.unsquash[@activation]

#value=(value) ⇒ Object

One can explicitly set the neuron’s value, typically used to set the input neurons. The given “real world” value is squashed into the neuron’s activation value.



# File 'lib/neuronet/neuron.rb', line 60

def value=(value)
  # If value is out of bounds, set it to the bound.
  if value.abs > Neuronet.maxv
    value = value.positive? ? Neuronet.maxv : -Neuronet.maxv
  end
  @activation = Neuronet.squash[value]
end
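The clamp-then-squash behavior can be sketched with the standard logistic squash and its inverse. Assumptions: the library’s squash, unsquash, and maxv are configurable; MAXV = 9.0 here is only illustrative:

```ruby
# Squash maps a "real world" value into (0, 1); unsquash inverts it.
MAXV = 9.0  # stand-in for Neuronet.maxv

squash   = ->(v) { 1.0 / (1.0 + Math.exp(-v)) }
unsquash = ->(a) { Math.log(a / (1.0 - a)) }

set_value = lambda do |value|
  # Out-of-bounds values are clamped to the bound before squashing:
  value = value.positive? ? MAXV : -MAXV if value.abs > MAXV
  squash.call(value)
end

puts unsquash.call(set_value.call(1.5))    # recovers 1.5
puts unsquash.call(set_value.call(100.0))  # clamped: recovers 9.0
```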