PReLU
Applies the element-wise Parametric Rectified Linear Unit (PReLU): PReLU(x) = x for x >= 0 and a * x for x < 0, where a is a learnable slope.
Abstract Signature:
PReLU(num_parameters: int = 1, init: float = 0.25)
num_parameters sets how many learnable slopes are created (1 shares a single slope across all channels); init is the initial value of each slope.
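The abstract signature can be read as the following function; this is a minimal NumPy sketch of the math only (the function name and scalar-slope simplification are ours, not from any listed framework):

```python
import numpy as np

def prelu(x, a=0.25):
    """Parametric ReLU: identity for x >= 0, slope `a` for x < 0."""
    return np.where(x >= 0, x, a * x)

x = np.array([-4.0, -1.0, 0.0, 2.0])
print(prelu(x))  # negative entries scaled by 0.25: [-1., -0.25, 0., 2.]
```

In the real layers, `a` is a trained parameter (one per channel when num_parameters > 1), not a fixed constant.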
PyTorch
API: torch.nn.PReLU
Strategy: Direct Mapping
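Since the mapping is direct, the abstract signature carries over verbatim to `torch.nn.PReLU`. A short usage sketch:

```python
import torch
import torch.nn as nn

# Direct mapping: same parameter names and defaults as the abstract signature.
prelu = nn.PReLU(num_parameters=1, init=0.25)

x = torch.tensor([-2.0, 0.0, 3.0])
y = prelu(x)
# With the default slope 0.25, negative inputs are scaled by 0.25
# and non-negative inputs pass through: [-0.5, 0.0, 3.0].
```

The slope is registered as a parameter (`prelu.weight`), so it is updated by the optimizer during training.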
Keras
API: keras.layers.PReLU
Strategy: Direct Mapping
TensorFlow
API: tf.keras.layers.PReLU
Strategy: Direct Mapping
Apple MLX
API: mlx.nn.PReLU
Strategy: Direct Mapping
Flax NNX
API: flax.nnx.PReLU
Strategy: Direct Mapping
PaxML / Praxis
API: praxis.layers.PReLU
Strategy: Direct Mapping