LeakyRelu
=========

LeakyRelu takes input data (Tensor) and an argument ``alpha``, and produces one output data (Tensor) where the function ``f(x) = alpha * x for x < 0``, ``f(x) = x for x >= 0`` is applied to the data tensor elementwise.

**Abstract Signature:** ``LeakyRelu(X: Tensor, alpha: float)``
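The elementwise definition above can be sketched in plain Python (a minimal illustration of the formula, not any framework's implementation; the default ``alpha=0.01`` here is a common convention, not mandated by the abstract signature):

```python
def leaky_relu(x, alpha=0.01):
    """Apply f(v) = alpha * v for v < 0, f(v) = v for v >= 0, elementwise."""
    return [alpha * v if v < 0 else v for v in x]

# Negative inputs are scaled by alpha; non-negative inputs pass through.
print(leaky_relu([-2.0, 0.0, 3.0], alpha=0.1))
```

Framework implementations apply the same function to each element of the tensor, typically via vectorized kernels rather than a Python loop.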

PyTorch
-------

- **API:** ``torch.nn.modules.activation.LeakyReLU``
- **Strategy:** Direct Mapping

Keras
-----

- **API:** ``keras.layers.LeakyReLU``
- **Strategy:** Direct Mapping

TensorFlow
----------

- **API:** ``keras.layers.LeakyReLU``
- **Strategy:** Direct Mapping

Apple MLX
---------

- **API:** ``mlx.nn.layers.activations.LeakyReLU``
- **Strategy:** Direct Mapping