SiLU
Sigmoid Linear Unit (SiLU, also known as Swish) activation layer; computes silu(x) = x * sigmoid(x) element-wise.
PyTorch
API: torch.nn.SiLU
Strategy: Direct Mapping
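A minimal usage sketch, assuming the module form of the mapping (torch.nn.SiLU applied to a tensor):

```python
import torch
import torch.nn as nn

silu = nn.SiLU()                         # torch.nn.SiLU module
x = torch.tensor([-1.0, 0.0, 1.0])
y = silu(x)                              # element-wise x * sigmoid(x)
```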
JAX (Core)
API: jax.nn.silu
Strategy: Direct Mapping
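A minimal sketch of the functional mapping, assuming element-wise application to a jax.numpy array:

```python
import jax
import jax.numpy as jnp

x = jnp.array([-1.0, 0.0, 1.0])
y = jax.nn.silu(x)                       # element-wise x * sigmoid(x)
```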
NumPy
API: None (no built-in SiLU; composed from NumPy primitives via the macro below)
Strategy: Macro '{x} * (1 / (1 + np.exp(-{x})))'
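A minimal sketch of the macro expansion, with the {x} placeholder bound to a concrete array (the array name is illustrative):

```python
import numpy as np

x = np.array([-1.0, 0.0, 1.0])
y = x * (1 / (1 + np.exp(-x)))           # sigmoid(x) computed inline, then scaled by x
```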
Keras
API: keras.layers.Activation("silu")
Strategy: Direct Mapping
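A minimal sketch, assuming Keras 3 where "silu" is a registered activation string:

```python
import numpy as np
import keras

layer = keras.layers.Activation("silu")  # string resolves to keras.activations.silu
y = layer(np.array([-1.0, 0.0, 1.0]))
```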
TensorFlow
API: tf.nn.silu
Strategy: Direct Mapping
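A minimal sketch of the functional op applied to a constant tensor:

```python
import tensorflow as tf

x = tf.constant([-1.0, 0.0, 1.0])
y = tf.nn.silu(x)                        # element-wise x * sigmoid(x)
```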
Apple MLX
API: mlx.nn.SiLU
Strategy: Direct Mapping
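A minimal sketch, assuming mlx.nn.SiLU is used as a callable module over an mlx.core array:

```python
import mlx.core as mx
import mlx.nn as nn

silu = nn.SiLU()
y = silu(mx.array([-1.0, 0.0, 1.0]))     # element-wise x * sigmoid(x)
```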
Flax NNX
API: nnx.silu
Strategy: Direct Mapping
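A minimal sketch, assuming nnx re-exports the functional silu listed above:

```python
import jax.numpy as jnp
from flax import nnx

x = jnp.array([-1.0, 0.0, 1.0])
y = nnx.silu(x)                          # same semantics as jax.nn.silu
```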
PaxML / Praxis
API: praxis.layers.SiLU
Strategy: Direct Mapping