LeakyRelu

LeakyRelu takes an input tensor (Tensor&lt;T&gt;) and an argument alpha, and produces one output tensor (Tensor&lt;T&gt;) by applying the function f(x) = alpha * x for x &lt; 0, f(x) = x for x &gt;= 0, to the input elementwise.

Abstract Signature:

LeakyRelu(X: Tensor, alpha: float)
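As a reference for the abstract signature above, the elementwise definition can be sketched in plain Python (the helper name `leaky_relu` and the default alpha of 0.01 are illustrative, not part of any framework listed below):

```python
def leaky_relu(x, alpha=0.01):
    """Apply f(x) = x if x >= 0 else alpha * x, elementwise over nested lists."""
    if isinstance(x, list):
        return [leaky_relu(v, alpha) for v in x]
    return x if x >= 0 else alpha * x

# leaky_relu([-2.0, 0.0, 3.0], alpha=0.1) -> [-0.2, 0.0, 3.0]
```

Each framework below exposes this same function directly, so the mapping strategy is a one-to-one translation of the call.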

PyTorch

API: torch.nn.modules.activation.LeakyReLU
Strategy: Direct Mapping
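A minimal usage sketch of the direct mapping in PyTorch; note that PyTorch names the alpha argument `negative_slope` (default 0.01):

```python
import torch

# PyTorch exposes LeakyRelu as a module; alpha maps to `negative_slope`.
m = torch.nn.LeakyReLU(negative_slope=0.1)
x = torch.tensor([-2.0, 0.0, 3.0])
y = m(x)  # negative inputs are scaled by 0.1; non-negative inputs pass through
```

The functional form `torch.nn.functional.leaky_relu(x, negative_slope=0.1)` is equivalent.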

Keras

API: keras.layers.LeakyReLU
Strategy: Direct Mapping

TensorFlow

API: tf.keras.layers.LeakyReLU
Strategy: Direct Mapping

Apple MLX

API: mlx.nn.layers.activations.LeakyReLU
Strategy: Direct Mapping