mindspore.nn.RNNCell

class mindspore.nn.RNNCell(input_size: int, hidden_size: int, has_bias: bool = True, nonlinearity: str = 'tanh', dtype=mstype.float32)

An Elman RNN cell with tanh or ReLU non-linearity.

h_t = \tanh(W_{ih} x_t + b_{ih} + W_{hh} h_{(t-1)} + b_{hh})

Here h_t is the hidden state at time t, x_t is the input at time t, and h_{(t-1)} is the hidden state at the previous time step t-1, or the initial hidden state at time 0. If nonlinearity is 'relu', then relu is used in place of tanh.
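The update above can be sketched in plain NumPy. This is an illustrative reimplementation, not the cell's internal code; the weight names and the (hidden_size, input_size) / (hidden_size, hidden_size) shape convention are assumptions made here for clarity:

```python
import numpy as np

def elman_cell(x, h_prev, W_ih, b_ih, W_hh, b_hh, nonlinearity="tanh"):
    # One Elman step: h_t = act(x @ W_ih.T + b_ih + h_prev @ W_hh.T + b_hh)
    act = np.tanh if nonlinearity == "tanh" else (lambda z: np.maximum(z, 0.0))
    return act(x @ W_ih.T + b_ih + h_prev @ W_hh.T + b_hh)

rng = np.random.default_rng(0)
batch_size, input_size, hidden_size = 3, 10, 16

x = rng.standard_normal((batch_size, input_size)).astype(np.float32)
h = np.zeros((batch_size, hidden_size), dtype=np.float32)  # initial hidden state

# Illustrative parameters (in the cell these are learned).
W_ih = rng.standard_normal((hidden_size, input_size)).astype(np.float32)
W_hh = rng.standard_normal((hidden_size, hidden_size)).astype(np.float32)
b_ih = np.zeros(hidden_size, dtype=np.float32)
b_hh = np.zeros(hidden_size, dtype=np.float32)

h = elman_cell(x, h, W_ih, b_ih, W_hh, b_hh)
print(h.shape)  # (3, 16)
```

With tanh, every entry of the new hidden state lies in (-1, 1); with 'relu' the state is unbounded above, which is why tanh is the default.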

Parameters
  • input_size (int) – Number of features of input.

  • hidden_size (int) – Number of features of hidden layer.

  • has_bias (bool) – Whether the cell has bias terms b_{ih} and b_{hh}. Default: True.

  • nonlinearity (str) – The non-linearity to use. Can be either "tanh" or "relu". Default: "tanh".

  • dtype (mindspore.dtype) – Dtype of the cell's Parameters. Default: mstype.float32.

Inputs:
  • x (Tensor) - Tensor of shape (batch_size, input_size).

  • hx (Tensor) - Tensor of data type mindspore.float32 and shape (batch_size, hidden_size).

Outputs:
  • hx' (Tensor) - Tensor of shape (batch_size, hidden_size).

Raises
  • TypeError – If input_size or hidden_size is not an int, or is not greater than 0.

  • TypeError – If has_bias is not a bool.

  • ValueError – If nonlinearity is not in ['tanh', 'relu'].

Supported Platforms:

Ascend GPU CPU

Examples

>>> import mindspore as ms
>>> import numpy as np
>>> net = ms.nn.RNNCell(10, 16)
>>> x = ms.Tensor(np.ones([5, 3, 10]).astype(np.float32))  # (seq_len, batch_size, input_size)
>>> hx = ms.Tensor(np.ones([3, 16]).astype(np.float32))
>>> output = []
>>> for i in range(5):
...     hx = net(x[i], hx)
...     output.append(hx)
>>> print(output[0].shape)
(3, 16)