ReLU6
=====

ReLU6 activation layer.
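Every API listed below computes the same function, ReLU6(x) = min(max(x, 0), 6). As a reference point for checking any of the mappings, here is a minimal pure-Python sketch (the ``relu6`` name here is illustrative, not any framework's API):

```python
def relu6(x):
    """ReLU6 clamps its input to the range [0, 6]: min(max(x, 0), 6)."""
    return min(max(x, 0.0), 6.0)

print(relu6(-2.0))  # negative inputs are zeroed, as in plain ReLU
print(relu6(3.5))   # values in [0, 6] pass through unchanged
print(relu6(10.0))  # values above 6 are capped at 6
```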

PyTorch
-------

API: ``torch.nn.ReLU6``
Strategy: Direct Mapping

JAX (Core)
----------

API: ``jax.nn.relu6``
Strategy: Direct Mapping

Keras
-----

API: ``keras.layers.ReLU`` (with ``max_value=6.0``)
Strategy: Direct Mapping

TensorFlow
----------

API: ``tf.keras.layers.ReLU`` (with ``max_value=6.0``)
Strategy: Direct Mapping

Apple MLX
---------

API: ``mlx.nn.ReLU6``
Strategy: Direct Mapping

Flax NNX
--------

API: ``nnx.relu6``
Strategy: Direct Mapping

PaxML / Praxis
--------------

API: ``praxis.layers.ReLU6``
Strategy: Direct Mapping