ReLU2ΒΆ

Applies one of two activations that share the name ReLU2, depending on the framework's interpretation: a ReLU capped at 2, i.e. min(max(0, x), 2) (analogous to ReLU6 with a cap of 2), or squared ReLU, i.e. max(0, x)^2.
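
A scalar reference for the two readings, as a minimal sketch (the function names are illustrative, not any framework's API):

```python
def relu2_capped(x: float) -> float:
    # "Capped ReLU" reading: clamp the activation to [0, 2],
    # analogous to ReLU6 with a cap of 2.
    return min(max(0.0, x), 2.0)


def relu2_squared(x: float) -> float:
    # "Squared ReLU" reading: square the positive part.
    return max(0.0, x) ** 2
```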

PyTorch

API: β€”
Strategy: Custom / Partial
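
PyTorch has no built-in ReLU2, so a small custom module can cover either reading. A minimal sketch, assuming the module name and the `squared` flag as illustrative choices (they are not part of `torch.nn`):

```python
import torch
from torch import nn


class ReLU2(nn.Module):
    """Custom ReLU2: squared=False gives the capped reading,
    squared=True gives squared ReLU. Illustrative, not a torch.nn API."""

    def __init__(self, squared: bool = False):
        super().__init__()
        self.squared = squared

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        if self.squared:
            return torch.relu(x).square()          # max(0, x)^2
        return torch.clamp(x, min=0.0, max=2.0)    # min(max(0, x), 2)
```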

Apple MLX

API: mlx.nn.ReLU2
Strategy: Direct Mapping
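
Assuming the direct mapping listed above, usage is a one-liner; which of the two readings `mlx.nn.ReLU2` implements should be confirmed against the MLX documentation:

```python
import mlx.core as mx
import mlx.nn as nn

# Direct mapping per the table above (assumes mlx.nn.ReLU2 as listed).
act = nn.ReLU2()
x = mx.array([-1.0, 0.5, 1.5, 3.0])
y = act(x)
```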