Class: Torch::NN::LayerNorm
Instance Attribute Summary
Attributes inherited from Module
Instance Method Summary collapse
- #extra_inspect ⇒ Object
- #forward(input) ⇒ Object
- #initialize(normalized_shape, eps: 1e-5, elementwise_affine: true) ⇒ LayerNorm (constructor) — A new instance of LayerNorm.
- #reset_parameters ⇒ Object
Methods inherited from Module
#_apply, #add_module, #apply, #buffers, #call, #children, #cpu, #cuda, #deep_dup, #double, #eval, #float, #half, #inspect, #load_state_dict, #method_missing, #modules, #named_buffers, #named_children, #named_modules, #named_parameters, #parameters, #register_buffer, #register_parameter, #requires_grad!, #respond_to?, #share_memory, #state_dict, #to, #train, #type, #zero_grad
Methods included from Utils
#_activation_fn, #_clones, #_ntuple, #_pair, #_quadrupal, #_single, #_triple
Constructor Details
#initialize(normalized_shape, eps: 1e-5, elementwise_affine: true) ⇒ LayerNorm
Returns a new instance of LayerNorm.
# File 'lib/torch/nn/layer_norm.rb', line 4

def initialize(normalized_shape, eps: 1e-5, elementwise_affine: true)
  super()
  @normalized_shape = Array(normalized_shape)
  @eps = eps
  @elementwise_affine = elementwise_affine
  if @elementwise_affine
    @weight = Parameter.new(Torch::Tensor.new(*normalized_shape))
    @bias = Parameter.new(Torch::Tensor.new(*normalized_shape))
  else
    register_parameter("weight", nil)
    register_parameter("bias", nil)
  end
  reset_parameters
end
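Note that the constructor passes `normalized_shape` through Ruby's `Array()` conversion, so it accepts either a single Integer (for normalizing over one trailing dimension) or an Array of dimension sizes. A minimal plain-Ruby illustration of that conversion (no torch-rb required):

```ruby
# Array() wraps a scalar in an array and leaves an array unchanged,
# which is how @normalized_shape is made uniform in #initialize.
scalar_shape = Array(5)       # an Integer becomes a one-element shape
array_shape  = Array([3, 4])  # an Array passes through as-is

puts scalar_shape.inspect  # => [5]
puts array_shape.inspect   # => [3, 4]
```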
Dynamic Method Handling
This class handles dynamic methods through the method_missing method in the class Torch::NN::Module.
Instance Method Details
#extra_inspect ⇒ Object
# File 'lib/torch/nn/layer_norm.rb', line 30

def extra_inspect
  format("%{normalized_shape}, eps: %{eps}, elementwise_affine: %{elementwise_affine}", **dict)
end
#forward(input) ⇒ Object
# File 'lib/torch/nn/layer_norm.rb', line 26

def forward(input)
  F.layer_norm(input, @normalized_shape, weight: @weight, bias: @bias, eps: @eps)
end
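`#forward` delegates to `F.layer_norm`, which normalizes the input over the trailing `normalized_shape` dimensions to zero mean and unit variance, then applies the elementwise affine parameters when present. A minimal plain-Ruby sketch of that computation for a 1-D input (this stands in for `F.layer_norm` and does not use torch-rb; `weight` and `bias` here play the role of the module's `@weight` and `@bias`):

```ruby
# Sketch of layer normalization over a single dimension:
# subtract the mean, divide by sqrt(biased variance + eps),
# then optionally scale by weight and shift by bias.
def layer_norm_1d(input, eps: 1e-5, weight: nil, bias: nil)
  n    = input.size.to_f
  mean = input.sum / n
  var  = input.map { |x| (x - mean)**2 }.sum / n  # biased variance, as in layer norm
  input.each_with_index.map do |x, i|
    y = (x - mean) / Math.sqrt(var + eps)
    y = y * weight[i] if weight
    y = y + bias[i] if bias
    y
  end
end

out = layer_norm_1d([1.0, 2.0, 3.0, 4.0])
```

After normalization, `out` has mean approximately 0 and variance approximately 1 (slightly less than 1 because of the `eps` term in the denominator).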