Creates a bidirectional recurrent neural network. (deprecated)
tf.compat.v1.nn.static_bidirectional_rnn( cell_fw, cell_bw, inputs, initial_state_fw=None, initial_state_bw=None, dtype=None, sequence_length=None, scope=None )
Similar to the unidirectional case (static_rnn), but takes input and builds independent forward and backward RNNs, with the final forward and backward outputs depth-concatenated, such that the output has the format [time][batch][cell_fw.output_size + cell_bw.output_size]. The input_size of the forward and backward cells must match. The initial state for both directions is zero by default (but can be set optionally), and no intermediate states are ever returned -- the network is fully unrolled for the given (passed-in) length(s) of the sequence(s), or completely unrolled if length(s) is not given.
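The depth-concatenation described above can be sketched with NumPy; the shapes below are illustrative stand-ins for per-timestep forward and backward cell outputs, not a use of the API itself:

```python
import numpy as np

# Hypothetical sizes for illustration only.
T, batch, fw_units, bw_units = 4, 2, 5, 3

# Stand-ins for the per-timestep outputs of the forward and backward RNNs.
fw_outputs = [np.random.rand(batch, fw_units) for _ in range(T)]
bw_outputs = [np.random.rand(batch, bw_units) for _ in range(T)]

# The two directions are depth-concatenated per timestep, giving a
# length-T list of [batch, fw_units + bw_units] arrays.
outputs = [np.concatenate([fw, bw], axis=-1)
           for fw, bw in zip(fw_outputs, bw_outputs)]

print(len(outputs), outputs[0].shape)  # 4 (2, 8)
```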
| Args | |
|---|---|
| `cell_fw` | An instance of RNNCell, to be used for the forward direction. |
| `cell_bw` | An instance of RNNCell, to be used for the backward direction. |
| `inputs` | A length T list of inputs, each a tensor of shape `[batch_size, input_size]`, or a nested tuple of such elements. |
| `initial_state_fw` | (optional) An initial state for the forward RNN. This must be a tensor of appropriate type and shape `[batch_size, cell_fw.state_size]`. If `cell_fw.state_size` is a tuple, this should be a tuple of tensors of shapes `[batch_size, s] for s in cell_fw.state_size`. |
| `initial_state_bw` | (optional) Same as for `initial_state_fw`, but using the corresponding properties of `cell_bw`. |
| `dtype` | (optional) The data type for the initial state. Required if either of the initial states is not provided. |
| `sequence_length` | (optional) An int32/int64 vector, size `[batch_size]`, containing the actual lengths for each of the sequences. |
| `scope` | VariableScope for the created subgraph; defaults to `"bidirectional_rnn"`. |
| Returns |
|---|
| A tuple `(outputs, output_state_fw, output_state_bw)` where: `outputs` is a length `T` list of outputs (one for each input), which are depth-concatenated forward and backward outputs; `output_state_fw` is the final state of the forward rnn; `output_state_bw` is the final state of the backward rnn. |
| Raises | |
|---|---|
| `TypeError` | If `cell_fw` or `cell_bw` is not an instance of `RNNCell`. |
| `ValueError` | If `inputs` is `None` or an empty list. |
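A minimal usage sketch, assuming TensorFlow 2.x with v1 graph mode enabled; the sizes and `BasicRNNCell` choice are illustrative, not required by the API:

```python
import numpy as np
import tensorflow as tf

tf.compat.v1.disable_eager_execution()  # static RNNs require graph mode
tf.compat.v1.reset_default_graph()

# Hypothetical sizes for illustration.
T, batch_size, input_size, num_units = 4, 2, 3, 5
inputs = [tf.compat.v1.placeholder(tf.float32, [batch_size, input_size])
          for _ in range(T)]
cell_fw = tf.compat.v1.nn.rnn_cell.BasicRNNCell(num_units)
cell_bw = tf.compat.v1.nn.rnn_cell.BasicRNNCell(num_units)

# dtype is required here because no initial states are supplied.
outputs, state_fw, state_bw = tf.compat.v1.nn.static_bidirectional_rnn(
    cell_fw, cell_bw, inputs, dtype=tf.float32)

with tf.compat.v1.Session() as sess:
    sess.run(tf.compat.v1.global_variables_initializer())
    feed = {ph: np.zeros([batch_size, input_size], np.float32) for ph in inputs}
    out_vals = sess.run(outputs, feed_dict=feed)

# Each output is depth-concatenated: num_units (fw) + num_units (bw) = 10.
print(len(out_vals), out_vals[0].shape)
```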
© 2020 The TensorFlow Authors. All rights reserved.
Licensed under the Creative Commons Attribution License 3.0.
Code samples licensed under the Apache 2.0 License.