PRelu
=====

PRelu takes an input tensor ``X`` and a ``slope`` tensor as input, and produces one output tensor where the function ``f(x) = slope * x for x < 0``, ``f(x) = x for x >= 0`` is applied to the data tensor elementwise. This operator supports **unidirectional broadcasting**: the ``slope`` tensor must be unidirectionally broadcastable to the input ``X``.

**Abstract Signature:** ``PRelu(X: Tensor, slope: Tensor)``
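The elementwise definition above can be sketched in a few lines of NumPy. This is a minimal reference implementation, not any framework's actual kernel; the scalar slope ``0.25`` is chosen purely for illustration.

```python
import numpy as np

def prelu(x, slope):
    # f(x) = slope * x for x < 0, f(x) = x for x >= 0, elementwise.
    # NumPy broadcasting handles a scalar or per-channel slope tensor.
    return np.where(x < 0, slope * x, x)

x = np.array([-2.0, -0.5, 0.0, 3.0])
print(prelu(x, 0.25))  # -> [-0.5, -0.125, 0.0, 3.0]
```

Note that the broadcast here is NumPy's general bidirectional broadcasting; the ONNX operator only guarantees the unidirectional case (``slope`` broadcast to ``X``).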

PyTorch
-------

API: ``torch.nn.modules.activation.PReLU``
Strategy: Direct Mapping
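A short sketch of the direct mapping in PyTorch. ``nn.PReLU`` holds the slope as a learnable parameter; ``num_parameters=1`` shares one slope across all channels, and ``init=0.25`` (chosen here for illustration; it is also the default) sets its starting value.

```python
import torch
import torch.nn as nn

# One shared learnable slope, initialized to 0.25.
m = nn.PReLU(num_parameters=1, init=0.25)
x = torch.tensor([-2.0, 0.0, 3.0])
y = m(x)  # -> tensor([-0.5000, 0.0000, 3.0000]) before any training
```

Setting ``num_parameters`` to the channel count instead gives a per-channel slope, which corresponds to a ``slope`` tensor broadcast along the channel axis in the ONNX operator.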

Keras
-----

API: ``keras.layers.PReLU``
Strategy: Direct Mapping

TensorFlow
----------

API: ``keras.layers.PReLU``
Strategy: Direct Mapping
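Keras and TensorFlow share the same layer, so one sketch covers both entries above. By default ``PReLU`` initializes its ``alpha`` weights to zeros (behaving like ReLU until trained); the constant ``0.25`` below is an illustrative choice, not the default.

```python
import numpy as np
import tensorflow as tf

# Per-feature learnable slope, initialized to 0.25 for illustration.
layer = tf.keras.layers.PReLU(
    alpha_initializer=tf.keras.initializers.Constant(0.25))
x = tf.constant([[-2.0, 0.0, 3.0]])
y = layer(x)  # -> [[-0.5, 0.0, 3.0]]
```

Unlike PyTorch's shared-slope default, this layer learns one slope per feature axis unless ``shared_axes`` is set.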

Apple MLX
---------

API: ``mlx.nn.layers.activations.PReLU``
Strategy: Direct Mapping

Flax NNX
--------

API: ``nnx.nn.activations.PReLU``
Strategy: Direct Mapping