Class: Transformers::DebertaV2::DebertaV2Layer
- Inherits: Torch::NN::Module
  - Object
  - Torch::NN::Module
  - Transformers::DebertaV2::DebertaV2Layer
- Defined in: lib/transformers/models/deberta_v2/modeling_deberta_v2.rb
Instance Method Summary
- #forward(hidden_states, attention_mask, query_states: nil, relative_pos: nil, rel_embeddings: nil, output_attentions: false) ⇒ Object
- #initialize(config) ⇒ DebertaV2Layer (constructor)
  A new instance of DebertaV2Layer.
Constructor Details
#initialize(config) ⇒ DebertaV2Layer
Returns a new instance of DebertaV2Layer.
# File 'lib/transformers/models/deberta_v2/modeling_deberta_v2.rb', line 233

def initialize(config)
  super()
  @attention = DebertaV2Attention.new(config)
  @intermediate = DebertaV2Intermediate.new(config)
  @output = DebertaV2Output.new(config)
end
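A minimal construction sketch, not taken from this file: the DebertaV2Config class, its keyword arguments, and the values below are assumptions carried over from the Python library's defaults.

  require "transformers"

  # Hypothetical config values (typical DeBERTa-v2 base sizes); the exact
  # DebertaV2Config keywords are assumed to mirror the Python library.
  config = Transformers::DebertaV2::DebertaV2Config.new(
    hidden_size: 768,
    num_attention_heads: 12,
    intermediate_size: 3072
  )

  # The layer wires together DebertaV2Attention, DebertaV2Intermediate and
  # DebertaV2Output, as shown in the constructor above.
  layer = Transformers::DebertaV2::DebertaV2Layer.new(config)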
Instance Method Details
#forward(hidden_states, attention_mask, query_states: nil, relative_pos: nil, rel_embeddings: nil, output_attentions: false) ⇒ Object
# File 'lib/transformers/models/deberta_v2/modeling_deberta_v2.rb', line 240

def forward(
  hidden_states,
  attention_mask,
  query_states: nil,
  relative_pos: nil,
  rel_embeddings: nil,
  output_attentions: false
)
  attention_output = @attention.(
    hidden_states,
    attention_mask,
    output_attentions: output_attentions,
    query_states: query_states,
    relative_pos: relative_pos,
    rel_embeddings: rel_embeddings
  )
  if output_attentions
    attention_output, att_matrix = attention_output
  end
  intermediate_output = @intermediate.(attention_output)
  layer_output = @output.(intermediate_output, attention_output)
  if output_attentions
    [layer_output, att_matrix]
  else
    layer_output
  end
end
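A hedged usage sketch of #forward (not from this file), assuming the layer and config built above and torch.rb tensors. The square attention mask is an assumption about the expanded mask the encoder prepares before calling each layer.

  # Shapes follow the usual [batch, seq_len, hidden_size] layout.
  hidden_states = Torch.randn(1, 16, config.hidden_size)
  mask = Torch.ones(1, 16)
  # Assumed expanded [batch, 1, seq_len, seq_len] mask built from a 2-D mask.
  attention_mask = mask.unsqueeze(1).unsqueeze(2) * mask.unsqueeze(1).unsqueeze(-1)

  # Default call: returns only the layer output tensor.
  layer_output = layer.(hidden_states, attention_mask)

  # With output_attentions: true, the call returns [layer_output, att_matrix].
  layer_output, att_matrix = layer.(hidden_states, attention_mask, output_attentions: true)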