AttentionLayer
==============

Dot-product attention layer, a.k.a. Luong-style attention.

**Abstract Signature:** ``AttentionLayer(use_scale: bool = False, score_mode: str = "dot", dropout: float = 0.0, seed: Optional[int] = None)``
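For orientation, here is a minimal NumPy sketch of the dot-product (Luong-style) scoring the signature describes. It is illustrative only: the ``dot_product_attention`` helper, the example shapes, and the fixed ``1/sqrt(d)`` scaling (standing in for a learned scale when ``use_scale=True``) are assumptions, and ``dropout``, ``seed``, and the ``"concat"`` score mode are omitted.

.. code-block:: python

   import numpy as np

   def dot_product_attention(query, key, value, use_scale=False):
       """Sketch of Luong-style attention: softmax(query @ key^T) @ value."""
       # Scores: one dot product per (query position, key position) pair.
       scores = query @ np.swapaxes(key, -1, -2)        # (batch, Tq, Tv)
       if use_scale:
           # Stand-in for the layer's learned scalar scale.
           scores = scores / np.sqrt(query.shape[-1])
       # Softmax over the key/value axis.
       weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
       weights /= weights.sum(axis=-1, keepdims=True)
       return weights @ value                           # (batch, Tq, dim)

   q = np.random.rand(2, 4, 8)   # (batch, query_len, dim)
   v = np.random.rand(2, 6, 8)   # (batch, value_len, dim); key defaults to value
   print(dot_product_attention(q, v, v).shape)          # (2, 4, 8)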
**Framework equivalents:**

- PyTorch: ``torch.nn.MultiheadAttention``
- Keras: ``keras.layers.Attention``
- TensorFlow: ``tf.keras.layers.Attention``
- Flax: ``flax.linen.attention.dot_product_attention``
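Of the equivalents above, ``keras.layers.Attention`` matches the abstract signature most directly. A short usage sketch with arbitrary example shapes (shown with Keras 3; ``tf.keras.layers.Attention`` accepts the same arguments in recent TensorFlow releases):

.. code-block:: python

   import numpy as np
   import keras

   # Toy inputs: (batch, timesteps, feature_dim); shapes are arbitrary examples.
   query = np.random.random((2, 4, 8)).astype("float32")
   value = np.random.random((2, 6, 8)).astype("float32")

   # Mirrors the abstract signature: unscaled dot-product scores, no dropout.
   attention = keras.layers.Attention(use_scale=False, score_mode="dot", dropout=0.0)

   # Inputs go in as [query, value]; the key defaults to the value tensor.
   output, scores = attention([query, value], return_attention_scores=True)

   print(output.shape)   # (2, 4, 8) -- attended values, one per query step
   print(scores.shape)   # (2, 4, 6) -- attention weights over the value steps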