GELU
Applies the Gaussian Error Linear Units function.
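For reference, the exact definition and the tanh approximation selected by the `approximate` parameter (these are the standard formulas, not specific to any one framework):

```latex
\mathrm{GELU}(x) = x\,\Phi(x) = \frac{x}{2}\left(1 + \operatorname{erf}\!\left(\frac{x}{\sqrt{2}}\right)\right)

\mathrm{GELU}_{\text{tanh}}(x) \approx \frac{x}{2}\left(1 + \tanh\!\left(\sqrt{\tfrac{2}{\pi}}\,\bigl(x + 0.044715\,x^{3}\bigr)\right)\right)
```

Here Φ is the standard normal CDF; the tanh form is the common fast approximation most of the frameworks below expose via an `approximate` flag.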
Abstract Signature:
GELU(approximate: str = "none")
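A minimal pure-Python sketch of this signature's semantics (stdlib only, no framework dependency; the constant 0.044715 is the standard tanh-approximation coefficient):

```python
import math


def gelu(x: float, approximate: str = "none") -> float:
    """Sketch of GELU matching the abstract signature above.

    GELU(x) = x * Phi(x), where Phi is the standard normal CDF.
    approximate="tanh" selects the common tanh-based approximation.
    """
    if approximate == "none":
        # Exact form via the error function: Phi(x) = 0.5 * (1 + erf(x / sqrt(2)))
        return 0.5 * x * (1.0 + math.erf(x / math.sqrt(2.0)))
    if approximate == "tanh":
        # Tanh approximation: 0.5 * x * (1 + tanh(sqrt(2/pi) * (x + 0.044715 * x^3)))
        return 0.5 * x * (
            1.0 + math.tanh(math.sqrt(2.0 / math.pi) * (x + 0.044715 * x ** 3))
        )
    raise ValueError(f"unknown approximate mode: {approximate!r}")
```

The two modes agree closely for moderate inputs; frameworks expose the flag because the tanh form avoids an erf evaluation.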
PyTorch
API: torch.nn.GELU
Strategy: Direct Mapping
JAX (Core)
API: jax.nn.gelu
Strategy: Direct Mapping (note: jax.nn.gelu defaults to approximate=True, i.e. the tanh approximation, whereas the abstract signature defaults to the exact form)
Keras
API: keras.activations.gelu (or keras.layers.Activation("gelu"))
Strategy: Direct Mapping
TensorFlow
API: tf.nn.gelu
Strategy: Direct Mapping
Apple MLX
API: mlx.nn.GELU
Strategy: Direct Mapping
Flax NNX
API: flax.nnx.gelu
Strategy: Direct Mapping
PaxML / Praxis
API: praxis.layers.activations.GELU
Strategy: Direct Mapping