class torch.nn.RNNCell(input_size: int, hidden_size: int, bias: bool = True, nonlinearity: str = 'tanh')
An Elman RNN cell with tanh or ReLU non-linearity.

$h' = \tanh(W_{ih} x + b_{ih} + W_{hh} h + b_{hh})$

If nonlinearity is 'relu', then ReLU is used in place of tanh.
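For illustration, the update above can be reproduced by hand from the cell's parameters (weight_ih, weight_hh, bias_ih, bias_hh, listed under Variables below); a minimal sketch with arbitrary sizes:

>>> import torch
>>> import torch.nn as nn
>>> cell = nn.RNNCell(4, 3)   # input_size=4, hidden_size=3
>>> x = torch.randn(2, 4)     # (batch, input_size)
>>> h = torch.randn(2, 3)     # (batch, hidden_size)
>>> h_manual = torch.tanh(x @ cell.weight_ih.t() + cell.bias_ih
...                       + h @ cell.weight_hh.t() + cell.bias_hh)
>>> torch.allclose(cell(x, h), h_manual)
True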
Parameters
    input_size – The number of expected features in the input x
    hidden_size – The number of features in the hidden state h
    bias – If False, then the layer does not use bias weights b_ih and b_hh. Default: True
    nonlinearity – The non-linearity to use. Can be either 'tanh' or 'relu'. Default: 'tanh'

Inputs: input, hidden
    input of shape (batch, input_size): tensor containing input features
    hidden of shape (batch, hidden_size): tensor containing the initial hidden state for each element in the batch. Defaults to zero if not provided.

Outputs: h'
    h' of shape (batch, hidden_size): tensor containing the next hidden state for each element in the batch

Variables
    weight_ih – the learnable input-hidden weights, of shape (hidden_size, input_size)
    weight_hh – the learnable hidden-hidden weights, of shape (hidden_size, hidden_size)
    bias_ih – the learnable input-hidden bias, of shape (hidden_size)
    bias_hh – the learnable hidden-hidden bias, of shape (hidden_size)
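These shapes can be confirmed on a freshly constructed cell; for example:

>>> rnn = nn.RNNCell(10, 20)
>>> for name, p in rnn.named_parameters():
...     print(name, tuple(p.shape))
weight_ih (20, 10)
weight_hh (20, 20)
bias_ih (20,)
bias_hh (20,)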
Note
All the weights and biases are initialized from $\mathcal{U}(-\sqrt{k}, \sqrt{k})$ where $k = \frac{1}{\text{hidden\_size}}$.
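A quick empirical check of these bounds (a sketch assuming the default initialization; it only verifies that every parameter falls inside the stated uniform range):

>>> import math
>>> rnn = nn.RNNCell(10, 20)
>>> bound = math.sqrt(1.0 / 20)   # sqrt(k) with k = 1 / hidden_size
>>> all(p.min() >= -bound and p.max() <= bound for p in rnn.parameters())
True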
Examples:
>>> rnn = nn.RNNCell(10, 20)
>>> input = torch.randn(6, 3, 10)
>>> hx = torch.randn(3, 20)
>>> output = []
>>> for i in range(6):
...     hx = rnn(input[i], hx)
...     output.append(hx)
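The collected hidden states can then be stacked into a single (seq_len, batch, hidden_size) tensor:

>>> torch.stack(output).shape
torch.Size([6, 3, 20])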