LeakyRelu

LeakyRelu takes input data (Tensor<T>) and an argument alpha, and produces one output data (Tensor<T>) where the function f(x) = alpha * x for x < 0, f(x) = x for x >= 0, is applied to the data tensor elementwise.

Abstract Signature: LeakyRelu(X: Tensor, alpha: float)

Framework mappings:

PyTorch
  API: torch.nn.modules.activation.LeakyReLU
  Strategy: Direct Mapping
  Official Docs

Keras
  API: keras.layers.LeakyReLU
  Strategy: Direct Mapping
  Official Docs

TensorFlow
  API: keras.layers.LeakyReLU
  Strategy: Direct Mapping
  Official Docs

Apple MLX
  API: mlx.nn.layers.activations.LeakyReLU
  Strategy: Direct Mapping
  Official Docs
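As a minimal sketch of the Direct Mapping strategy for the PyTorch backend: the abstract alpha argument corresponds to PyTorch's negative_slope constructor parameter; the alpha value of 0.01 below is only an illustrative choice, not a value prescribed by this page.

```python
import torch
import torch.nn as nn

# Example input covering both branches of the function.
x = torch.tensor([-2.0, -0.5, 0.0, 1.0, 3.0])

# f(x) = alpha * x for x < 0, f(x) = x for x >= 0;
# alpha maps directly to PyTorch's negative_slope argument.
leaky = nn.LeakyReLU(negative_slope=0.01)

print(leaky(x))  # tensor([-0.0200, -0.0050,  0.0000,  1.0000,  3.0000])
```

The other backends follow the same pattern: the listed class is instantiated with the alpha value passed through its slope parameter, and the resulting layer is applied elementwise to the input tensor.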