activation_relu_or_geluΒΆ

Auto-generated from mlx_code_defs

PyTorch

API: torch.nn.modules.transformer.TransformerEncoderLayer.activation_relu_or_gelu
Strategy: Direct Mapping

Apple MLX

API: mlx.nn.layers.transformer.TransformerEncoderLayer.activation
Strategy: Direct Mapping
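The two attributes differ in representation even though the mapping is direct: PyTorch's `activation_relu_or_gelu` is an integer fast-path flag derived from the layer's activation (to my understanding, 1 for relu, 2 for gelu, 0 for any other callable), whereas MLX's `activation` holds the activation function itself. A minimal sketch of translating one to the other, using stand-in activation functions rather than the real `torch`/`mlx` imports (`activation_from_flag` is a hypothetical helper, not part of either library):

```python
import math

def relu(x):
    """Stand-in for torch.nn.functional.relu / mlx.nn.relu on a scalar."""
    return max(x, 0.0)

def gelu(x):
    """Stand-in for exact GELU: x * Phi(x), with Phi the standard normal CDF."""
    return x * 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def activation_from_flag(activation_relu_or_gelu):
    """Hypothetical helper: map PyTorch's integer flag to the callable
    that MLX's TransformerEncoderLayer expects for `activation`."""
    table = {1: relu, 2: gelu}
    if activation_relu_or_gelu not in table:
        # Flag 0 means the PyTorch layer was built with a custom activation;
        # in that case the original callable must be passed through directly.
        raise ValueError("flag 0: pass the original activation callable")
    return table[activation_relu_or_gelu]

fn = activation_from_flag(2)
print(fn is gelu)  # True
```

Going the other direction (MLX callable to PyTorch flag) is an identity check against the known functions, with 0 as the fallback for anything else.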