HardSwish

HardSwish takes one input data (Tensor<T>) and produces one output data (Tensor<T>) where the HardSwish function, y = x * max(0, min(1, alpha * x + beta)) = x * HardSigmoid<alpha, beta>(x), with alpha = 1/6 and beta = 0.5, is applied to the tensor elementwise.
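
As a point of reference, a minimal NumPy sketch of the elementwise formula with the fixed defaults; the helper name hardswish_ref is hypothetical:

    import numpy as np

    def hardswish_ref(x: np.ndarray, alpha: float = 1.0 / 6.0, beta: float = 0.5) -> np.ndarray:
        # y = x * max(0, min(1, alpha * x + beta)), applied elementwise
        # (equivalently, x * HardSigmoid<alpha, beta>(x))
        return x * np.clip(alpha * x + beta, 0.0, 1.0)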

Abstract Signature:

HardSwish(X: Tensor)

PyTorch

API: torch.nn.functional.hardswish
Strategy: Direct Mapping
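
A minimal usage sketch of the direct mapping; the sample input values are illustrative:

    import torch
    import torch.nn.functional as F

    x = torch.tensor([-4.0, -1.0, 0.0, 1.0, 4.0])
    # hardswish computes x * relu6(x + 3) / 6, which equals the ONNX
    # formula with alpha = 1/6 and beta = 0.5
    y = F.hardswish(x)  # approx. [0.0, -0.3333, 0.0, 0.6667, 4.0]

PyTorch's hardswish hard-codes alpha = 1/6 and beta = 0.5, matching the ONNX defaults, so no attribute translation is needed.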

Apple MLX

API: mlx.nn.layers.activations.hardswish
Strategy: Direct Mapping
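
A comparable sketch for MLX, assuming the function is importable from the module path documented above; the sample input values are illustrative:

    import mlx.core as mx
    from mlx.nn.layers.activations import hardswish

    x = mx.array([-4.0, -1.0, 0.0, 1.0, 4.0])
    # MLX's hardswish likewise fixes alpha = 1/6 and beta = 0.5,
    # so the mapping is one-to-one with the ONNX operator
    y = hardswish(x)  # approx. [0.0, -0.3333, 0.0, 0.6667, 4.0]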