RNN

Computes a one-layer simple RNN. This operator is usually supported via a custom implementation such as cuDNN. Notations:

* X - input tensor
* i - input gate
* t - time step (t-1 means previous time step)
* Wi - W parameter weight matrix for input gate
* Ri - R recurrence weight matrix…

Abstract Signature:

RNN(X: Tensor, W: Tensor, R: Tensor, B: Tensor, sequence_lens: Tensor, initial_h: Tensor, activation_alpha: List[float], activation_beta: List[float], activations: List[str], clip: float, direction: str, hidden_size: int, layout: int)
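The per-step recurrence behind this signature is Ht = f(Xt·Wiᵀ + Ht-1·Riᵀ + Wbi + Rbi). A minimal NumPy sketch of the unidirectional case with the default tanh activation and no clipping (function and variable names here are illustrative, not part of the operator spec):

```python
import numpy as np

def simple_rnn_forward(X, W, R, B, initial_h):
    """One-layer unidirectional simple RNN, tanh activation.

    X: (seq_len, batch, input_size)   input sequence
    W: (hidden_size, input_size)      input weight matrix (Wi)
    R: (hidden_size, hidden_size)     recurrence weight matrix (Ri)
    B: (2 * hidden_size,)             concatenated biases [Wbi, Rbi]
    initial_h: (batch, hidden_size)   initial hidden state
    """
    hidden_size = W.shape[0]
    Wb, Rb = B[:hidden_size], B[hidden_size:]
    h = initial_h
    outputs = []
    for x_t in X:  # iterate over time steps
        # H_t = tanh(X_t Wi^T + H_{t-1} Ri^T + Wbi + Rbi)
        h = np.tanh(x_t @ W.T + h @ R.T + Wb + Rb)
        outputs.append(h)
    return np.stack(outputs), h  # all hidden states, final hidden state

# Shapes only; weights are random for illustration.
rng = np.random.default_rng(0)
X = rng.normal(size=(5, 2, 3))          # seq_len=5, batch=2, input_size=3
W = rng.normal(size=(4, 3))             # hidden_size=4
R = rng.normal(size=(4, 4))
B = np.zeros(8)
Y, h_final = simple_rnn_forward(X, W, R, B, np.zeros((2, 4)))
```

`bidirectional` direction doubles the weight tensors along a leading axis and runs a second pass in reverse; that is omitted here for brevity.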

PyTorch

API: torch.nn.modules.rnn.RNN
Strategy: Direct Mapping
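A sketch of the direct mapping, assuming the standard `torch.nn.RNN` constructor; PyTorch defaults to the (seq_len, batch, input_size) layout (`batch_first=False`) and a tanh nonlinearity, matching the abstract signature's single-direction case:

```python
import torch

# One-layer simple RNN: input_size=3, hidden_size=4, tanh by default.
rnn = torch.nn.RNN(input_size=3, hidden_size=4, num_layers=1)

x = torch.randn(5, 2, 3)   # (seq_len, batch, input_size)
h0 = torch.zeros(1, 2, 4)  # (num_layers, batch, hidden_size)
output, h_n = rnn(x, h0)   # all hidden states, final hidden state
```

`initial_h` maps to the optional second argument `h0`, and `direction="bidirectional"` maps to `bidirectional=True`.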

Keras

API: keras.layers.RNN
Strategy: Direct Mapping
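`keras.layers.RNN` is a generic wrapper around a recurrent cell, so the one-layer simple RNN corresponds to wrapping `SimpleRNNCell` (the convenience layer `keras.layers.SimpleRNN` is equivalent). A minimal sketch, assuming Keras 3:

```python
import numpy as np
import keras

# Wrap SimpleRNNCell(4) to get a one-layer simple RNN with hidden_size=4.
layer = keras.layers.RNN(keras.layers.SimpleRNNCell(4), return_sequences=True)

x = np.zeros((2, 5, 3), dtype="float32")  # (batch, seq_len, input_size)
y = layer(x)                              # (batch, seq_len, hidden_size)
```

Note the layout difference: Keras is batch-major, so mapping the operator's `layout` attribute may require transposing the time and batch axes.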

TensorFlow

API: tf.keras.layers.RNN
Strategy: Direct Mapping

Apple MLX

API: mlx.nn.layers.recurrent.RNN
Strategy: Direct Mapping

Flax NNX

API: nnx.nn.recurrent.RNN
Strategy: Direct Mapping