ReLU2
=====

Applies the ReLU activation capped at 2 (``min(max(x, 0), 2)``, analogous to ReLU6) or squared ReLU (``max(x, 0)**2``), depending on the framework's interpretation.

PyTorch
-------

API: none (no direct equivalent; implemented manually, see the sketch below)
Strategy: Custom / Partial

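Since PyTorch exposes no ``ReLU2`` module, a minimal sketch covering both interpretations described above, assuming the capped reading means ``clamp(x, 0, 2)`` and the squared reading means ``relu(x)**2``:

.. code-block:: python

    import torch

    def relu2_capped(x: torch.Tensor) -> torch.Tensor:
        # Capped interpretation: min(max(x, 0), 2), analogous to ReLU6.
        return torch.clamp(x, min=0.0, max=2.0)

    def relu2_squared(x: torch.Tensor) -> torch.Tensor:
        # Squared interpretation: relu(x) ** 2.
        return torch.relu(x).square()

    x = torch.tensor([-1.0, 0.5, 3.0])
    print(relu2_capped(x))   # tensor([0.0000, 0.5000, 2.0000])
    print(relu2_squared(x))  # tensor([0.0000, 0.2500, 9.0000])
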
Apple MLX
---------

API: ``mlx.nn.ReLU2``
Strategy: Direct Mapping
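Assuming ``mlx.nn.ReLU2`` is available as listed above (worth verifying against the installed MLX version), a usage sketch with a manual fallback for the capped interpretation:

.. code-block:: python

    import mlx.core as mx
    import mlx.nn as nn

    x = mx.array([-1.0, 0.5, 3.0])

    # Direct mapping, assuming the module name listed above exists:
    relu2 = nn.ReLU2()
    print(relu2(x))

    # Equivalent manual fallback for the capped interpretation:
    print(mx.clip(x, 0.0, 2.0))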