HardSwish
=========

HardSwish takes one input data (Tensor) and produces one output data (Tensor) where the HardSwish function, ``y = x * max(0, min(1, alpha * x + beta)) = x * HardSigmoid(x)`` with ``alpha = 1/6`` and ``beta = 0.5``, is applied to the tensor elementwise.

**Abstract Signature:** ``HardSwish(X: Tensor)``
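
For reference, the elementwise formula can be sketched directly in NumPy (a minimal illustration of the definition above, not code from either mapped library):

.. code-block:: python

    import numpy as np

    def hardswish(x: np.ndarray) -> np.ndarray:
        # y = x * max(0, min(1, alpha * x + beta)) with alpha = 1/6, beta = 0.5
        alpha, beta = 1.0 / 6.0, 0.5
        return x * np.clip(alpha * x + beta, 0.0, 1.0)

    x = np.array([-4.0, -1.5, 0.0, 1.5, 4.0])
    print(hardswish(x))  # -> [-0., -0.375, 0., 1.125, 4.]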

PyTorch
-------

**API:** ``torch.nn.functional.hardswish``

**Strategy:** Direct Mapping
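
The direct mapping in PyTorch is a one-line call (a minimal usage sketch, assuming a working PyTorch install):

.. code-block:: python

    import torch
    import torch.nn.functional as F

    x = torch.tensor([-4.0, -1.5, 0.0, 1.5, 4.0])
    y = F.hardswish(x)  # elementwise x * max(0, min(1, x / 6 + 0.5))
    print(y)  # tensor([-0.0000, -0.3750,  0.0000,  1.1250,  4.0000])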

Apple MLX
---------

**API:** ``mlx.nn.layers.activations.hardswish``

**Strategy:** Direct Mapping
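
The MLX mapping is analogous (a minimal usage sketch, assuming the ``mlx`` package is installed; the import path follows the API listed above):

.. code-block:: python

    import mlx.core as mx
    from mlx.nn.layers.activations import hardswish

    x = mx.array([-4.0, -1.5, 0.0, 1.5, 4.0])
    y = hardswish(x)  # elementwise x * max(0, min(1, x / 6 + 0.5))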