Module: DNN::Layers::LayerNode
- Included in:
- Add, AvgPool2D, BatchNormalization, Concatenate, Conv2D, Conv2DTranspose, Dense, Div, Dot, Dropout, ELU, Embedding, Exp, Lasso, LeakyReLU, Log, MaxPool2D, Mean, Mish, Mul, Neg, Pow, RNN, ReLU, Reshape, Ridge, Sigmoid, Softplus, Softsign, Split, Sqrt, Sub, Sum, Swish, Tanh, UnPool2D, DNN::Losses::Hinge, DNN::Losses::HuberLoss, DNN::Losses::MeanAbsoluteError, DNN::Losses::MeanSquaredError, DNN::Losses::SigmoidCrossEntropy, DNN::Losses::SoftmaxCrossEntropy
- Defined in:
- lib/dnn/core/layers/basic_layers.rb
Instance Method Summary
- #backward_node(*dys) ⇒ Object
- #forward(*inputs) ⇒ Object
- #forward_node(*xs) ⇒ Object
Instance Method Details
#backward_node(*dys) ⇒ Object
# File 'lib/dnn/core/layers/basic_layers.rb', line 19

def backward_node(*dys)
  # Abstract hook: each including layer must supply its own gradient computation.
  raise NotImplementedError, "Class '#{self.class.name}' has implement method 'backward_node'"
end
#forward(*inputs) ⇒ Object
# File 'lib/dnn/core/layers/basic_layers.rb', line 5

def forward(*inputs)
  # Unwrap the raw arrays from the incoming Tensors.
  xs = inputs.map(&:data)
  # Collect the upstream Links so the graph can be walked during backpropagation.
  prevs = inputs.map { |input| input.is_a?(Tensor) ? input.link : input }
  # Delegate the actual computation to the including layer's forward_node.
  ys = forward_node(*xs)
  num_outputs = (ys.is_a?(Array) ? ys.length : 1)
  # Record this node in the computational graph and connect it to its inputs.
  link = Link.new(prevs: prevs, layer_node: self, num_outputs: num_outputs)
  prevs.map { |prev| prev.next = link if prev.is_a?(Link) }
  Tensor.convert(ys, link)
end
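In practice this means a layer that includes LayerNode can be called on Tensor inputs directly and returns a Tensor already wired into the graph. A minimal sketch, assuming Numo::SFloat inputs and the Add layer from the "Included in" list above; the bare-array form of Tensor.convert follows the source shown here, but the exact values are illustrative:

require "dnn"

a = DNN::Tensor.convert(Numo::SFloat[1, 2, 3])
b = DNN::Tensor.convert(Numo::SFloat[4, 5, 6])
y = DNN::Layers::Add.new.forward(a, b)  # LayerNode#forward handles the Link wiring
y.data  # => Numo::SFloat[5, 7, 9]  (Add's forward_node computed x1 + x2)
y.link  # => the Link recording this node and its prevs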
#forward_node(*xs) ⇒ Object
# File 'lib/dnn/core/layers/basic_layers.rb', line 15

def forward_node(*xs)
  # Abstract hook: each including layer must supply its own forward computation.
  raise NotImplementedError, "Class '#{self.class.name}' has implement method 'forward_node'"
end
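Taken together, the module is a template: the concrete forward does the graph bookkeeping, while the abstract forward_node and backward_node carry the math. A minimal sketch of a custom layer, assuming the library's DNN::Layers::Layer base class; the Square class, and its caching of @x, are illustrative and not part of ruby-dnn:

require "dnn"

# Hypothetical element-wise layer; Square is not part of the library.
class Square < DNN::Layers::Layer
  include DNN::Layers::LayerNode

  def forward_node(x)
    @x = x        # cache the input for the backward pass
    @x**2
  end

  def backward_node(dy)
    dy * 2 * @x   # chain rule: d(x^2)/dx = 2x
  end
end

Because forward comes from this module, calling Square.new.forward(tensor) threads the result into the computational graph exactly as shown in the forward listing above.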