ReLU2ΒΆ
Applies the ReLU activation capped at 2, i.e. min(max(x, 0), 2), or the squared ReLU, max(x, 0)², depending on the framework's interpretation.
PyTorch
API: none (no direct PyTorch built-in)
Strategy: Custom / Partial
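Since PyTorch ships no ReLU2 (only ReLU6), a minimal sketch of both interpretations as custom modules; the class names here are illustrative, and the capped variant is assumed to clamp to [0, 2] while the squared variant computes max(x, 0)²:

```python
import torch
import torch.nn as nn

class CappedReLU2(nn.Module):
    """ReLU capped at 2: min(max(x, 0), 2), analogous to ReLU6."""
    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return torch.clamp(x, min=0.0, max=2.0)

class SquaredReLU(nn.Module):
    """Squared ReLU: max(x, 0) ** 2."""
    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return torch.relu(x).square()

x = torch.tensor([-1.0, 0.5, 3.0])
print(CappedReLU2()(x))  # tensor([0.0000, 0.5000, 2.0000])
print(SquaredReLU()(x))  # tensor([0.0000, 0.2500, 9.0000])
```

Which variant to use depends on the source framework's definition of ReLU2; check its documentation before choosing one of the two modules above.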