CubedReLU
=========

Activation: ``max(0, x)^3``.

PyTorch
-------

API: none (no built-in CubedReLU)
Strategy: macro ``lambda x: torch.relu(x)**3``
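Since PyTorch has no built-in CubedReLU, the macro above composes it from ``torch.relu``. A minimal sketch of the same behavior in plain Python (the ``cubed_relu`` name is illustrative, not part of any library):

```python
# Reference implementation of CubedReLU: max(0, x)**3.
# The equivalent PyTorch macro over tensors is: lambda x: torch.relu(x)**3

def cubed_relu(x: float) -> float:
    """Zero for negative inputs; x cubed otherwise."""
    return max(0.0, x) ** 3

values = [-2.0, -0.5, 0.0, 1.0, 2.0]
print([cubed_relu(v) for v in values])  # -> [0.0, 0.0, 0.0, 1.0, 8.0]
```

Note that cubing preserves the sign only because the ReLU has already clamped negatives to zero; applying ``x**3`` before the ReLU would give the same result, but the clamp-then-cube order matches the ``max(0, x)^3`` definition directly.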

PaxML / Praxis
--------------

API: ``praxis.layers.CubedReLU``
Strategy: direct mapping