ReLUΒΆ

Rectified Linear Unit activation layer. ReLU is applied element-wise: negative inputs are zeroed and non-negative inputs pass through unchanged.

PyTorch

API: torch.nn.ReLU
Strategy: Direct Mapping
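
A minimal usage sketch for the PyTorch mapping; the tensor values are illustrative only:

    import torch
    from torch import nn

    relu = nn.ReLU()                       # torch.nn.ReLU module
    x = torch.tensor([-2.0, 0.0, 3.0])
    print(relu(x))                         # tensor([0., 0., 3.])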

JAX (Core)

API: jax.nn.relu
Strategy: Direct Mapping
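
In core JAX the mapping is a plain function rather than a layer object; a short sketch with illustrative values:

    import jax.numpy as jnp
    from jax import nn

    x = jnp.array([-2.0, 0.0, 3.0])
    print(nn.relu(x))                      # [0. 0. 3.]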

Keras

API: keras.layers.ReLU
Strategy: Direct Mapping
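
A brief sketch of the Keras layer, assuming the standalone Keras 3 package (import keras); the same layer is also reachable through the TensorFlow entry below:

    import numpy as np
    import keras

    layer = keras.layers.ReLU()
    print(layer(np.array([-2.0, 0.0, 3.0])))   # [0. 0. 3.]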

TensorFlow

API: tf.keras.layers.ReLU
Strategy: Direct Mapping
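
The TensorFlow mapping is the Keras layer exposed under the tf namespace; a minimal sketch:

    import tensorflow as tf

    layer = tf.keras.layers.ReLU()
    print(layer(tf.constant([-2.0, 0.0, 3.0])))   # tf.Tensor([0. 0. 3.], ...)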

Apple MLX

API: mlx.nn.ReLU
Strategy: Direct Mapping
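
A short sketch for Apple MLX, where mlx.nn.ReLU is a callable module applied directly to an MLX array; values are illustrative only:

    import mlx.core as mx
    import mlx.nn as nn

    relu = nn.ReLU()
    x = mx.array([-2.0, 0.0, 3.0])
    print(relu(x))                         # [0, 0, 3]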

LaTeX DSL (MIDL)

API: midl.ReLU
Strategy: Direct Mapping
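
MIDL's concrete surface syntax is not reproduced in this table; as a reference point, the operation that midl.ReLU maps onto is the element-wise rectifier, which in standard LaTeX notation reads:

    \operatorname{ReLU}(x) = \max(0, x)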

Flax NNX

API: nnx.relu
Strategy: Direct Mapping
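
In Flax NNX the mapping is the functional activation re-exported under the nnx namespace; a minimal sketch:

    import jax.numpy as jnp
    from flax import nnx

    x = jnp.array([-2.0, 0.0, 3.0])
    print(nnx.relu(x))                     # [0. 0. 3.]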

PaxML / Praxis

API: praxis.layers.ReLU
Strategy: Direct Mapping
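
Praxis layers are configured and instantiated through Fiddle templates rather than constructed directly. The sketch below assumes the usual pax_fiddle.Config / instantiate flow and Flax-style init/apply; exact details may vary across Praxis versions:

    import jax
    import jax.numpy as jnp
    from praxis import base_layer, pax_fiddle, layers

    relu_p = pax_fiddle.Config(layers.ReLU, name='relu')
    relu = base_layer.instantiate(relu_p)

    x = jnp.array([-2.0, 0.0, 3.0])
    variables = relu.init(jax.random.PRNGKey(0), x)   # stateless: no trainable variables
    print(relu.apply(variables, x))                   # [0. 0. 3.]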