ELU
===

Exponential Linear Unit activation.

**Abstract Signature:** ``ELU(alpha: float = 1.0)``

PyTorch
-------

:API: ``torch.nn.ELU``
:Strategy: Direct Mapping

Keras
-----

:API: ``keras.layers.ELU``
:Strategy: Direct Mapping

TensorFlow
----------

:API: ``tf.keras.layers.ELU``
:Strategy: Direct Mapping

Apple MLX
---------

:API: ``mlx.nn.ELU``
:Strategy: Direct Mapping

Flax NNX
--------

:API: ``flax.nnx.ELU``
:Strategy: Direct Mapping

PaxML / Praxis
--------------

:API: ``praxis.layers.ELU``
:Strategy: Direct Mapping