Transformer
Transformer layer with multi-headed attention.
Abstract Signature:
Transformer(input_dims: int, hidden_dims: int, num_heads: int, dim_per_head: int)
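Below is a minimal sketch of what a Transformer layer with multi-headed attention along these lines might look like, written with PyTorch purely for illustration. The constructor parameters mirror the signature above; everything else (pre-norm layout, residual connections, a GELU feed-forward block, and the forward-pass shapes) is an assumption, not the library's actual implementation.

```python
import torch
import torch.nn as nn


class Transformer(nn.Module):
    """Illustrative Transformer layer; internal structure is assumed."""

    def __init__(self, input_dims: int, hidden_dims: int,
                 num_heads: int, dim_per_head: int):
        super().__init__()
        attn_dims = num_heads * dim_per_head
        # Project inputs to queries, keys, and values for all heads at once.
        self.qkv = nn.Linear(input_dims, 3 * attn_dims)
        self.attn_out = nn.Linear(attn_dims, input_dims)
        # Position-wise feed-forward block: input_dims -> hidden_dims -> input_dims.
        self.ffn = nn.Sequential(
            nn.Linear(input_dims, hidden_dims),
            nn.GELU(),
            nn.Linear(hidden_dims, input_dims),
        )
        self.norm1 = nn.LayerNorm(input_dims)
        self.norm2 = nn.LayerNorm(input_dims)
        self.num_heads = num_heads
        self.dim_per_head = dim_per_head

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: [batch, seq_len, input_dims]
        b, t, _ = x.shape
        q, k, v = self.qkv(self.norm1(x)).chunk(3, dim=-1)

        def split(h: torch.Tensor) -> torch.Tensor:
            # Split the last dimension into (num_heads, dim_per_head).
            return h.view(b, t, self.num_heads, self.dim_per_head).transpose(1, 2)

        q, k, v = split(q), split(k), split(v)
        # Scaled dot-product attention per head.
        scores = q @ k.transpose(-2, -1) / self.dim_per_head ** 0.5
        attn = scores.softmax(dim=-1) @ v            # [b, heads, t, dim_per_head]
        attn = attn.transpose(1, 2).reshape(b, t, -1)
        x = x + self.attn_out(attn)                  # residual around attention
        x = x + self.ffn(self.norm2(x))              # residual around feed-forward
        return x


# Example usage (hypothetical dimensions): output shape matches the input.
layer = Transformer(input_dims=64, hidden_dims=256, num_heads=4, dim_per_head=16)
out = layer(torch.randn(2, 10, 64))                 # -> [2, 10, 64]
```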