RNN
Computes a one-layer simple RNN. This operator is usually supported via some custom implementation such as CuDNN.

Notations:

* X - input tensor
* i - input gate
* t - time step (t-1 means previous time step)
* Wi - W parameter weight matrix for input gate
* Ri - R recurrence weight matrix…
Abstract Signature:
RNN(X: Tensor, W: Tensor, R: Tensor, B: Tensor, sequence_lens: Tensor, initial_h: Tensor, activation_alpha: List[float], activation_beta: List[float], activations: List[str], clip: float, direction: str, hidden_size: int, layout: int)
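To make the semantics concrete, below is a minimal NumPy sketch of the single-direction, layout=0 case with the default Tanh activation, computing H_t = f(X_t·W^T + H_{t-1}·R^T + Wb + Rb). The function name simple_rnn_forward and its simplified argument handling are illustrative only and not part of the operator definition.

```python
import numpy as np

def simple_rnn_forward(X, W, R, B=None, initial_h=None):
    """Sketch of a one-layer, forward-direction simple RNN (layout=0, Tanh).

    Assumed shapes (num_directions = 1):
        X: [seq_length, batch_size, input_size]
        W: [1, hidden_size, input_size]
        R: [1, hidden_size, hidden_size]
        B: [1, 2 * hidden_size]            # Wb and Rb concatenated
        initial_h: [1, batch_size, hidden_size]
    """
    seq_length, batch_size, _ = X.shape
    hidden_size = W.shape[1]

    # Split the bias into its input (Wb) and recurrence (Rb) parts,
    # defaulting to zeros when B is not provided.
    if B is not None:
        Wb, Rb = np.split(B[0], 2)
    else:
        Wb = Rb = np.zeros(hidden_size)

    # Previous hidden state, defaulting to zeros when initial_h is absent.
    H = initial_h[0] if initial_h is not None else np.zeros((batch_size, hidden_size))

    Y = np.zeros((seq_length, 1, batch_size, hidden_size))
    for t in range(seq_length):
        # H_t = f(X_t * W^T + H_{t-1} * R^T + Wb + Rb), with f = tanh by default
        H = np.tanh(X[t] @ W[0].T + H @ R[0].T + Wb + Rb)
        Y[t, 0] = H

    # Y holds all intermediate hidden states; the last H corresponds to Y_h.
    return Y, H[np.newaxis]
```

A call such as simple_rnn_forward(X, W, R) with randomly initialized arrays of the shapes noted above returns the full hidden-state sequence and the final hidden state, mirroring the operator's Y and Y_h outputs in this simplified setting.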