Layer normalization layer (Ba et al., 2016).
tf.keras.layers.LayerNormalization(
    axis=-1, epsilon=0.001, center=True, scale=True,
    beta_initializer='zeros', gamma_initializer='ones',
    beta_regularizer=None, gamma_regularizer=None,
    beta_constraint=None, gamma_constraint=None,
    trainable=True, name=None, **kwargs
)
Normalizes the activations of the previous layer for each example in a batch independently, rather than across the batch as Batch Normalization does. That is, it applies a transformation that keeps the mean activation within each example close to 0 and the activation standard deviation close to 1.
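For illustration, a minimal sketch of this per-example behavior (the printed values are approximate; exact results depend on floating-point precision):

import tensorflow as tf

# Two examples with very different scales; each row is normalized on its own,
# so the batch statistics play no role (unlike BatchNormalization).
data = tf.constant([[1.0, 2.0, 3.0],
                    [10.0, 20.0, 30.0]])

layer = tf.keras.layers.LayerNormalization(axis=-1)
outputs = layer(data)

# Per-example statistics after normalization: mean ~0, std ~1.
print(tf.reduce_mean(outputs, axis=-1))     # ~[0., 0.]
print(tf.math.reduce_std(outputs, axis=-1)) # ~[1., 1.]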
Args
| axis | Integer or List/Tuple. The axis that should be normalized (typically the features axis). |
| epsilon | Small float added to variance to avoid dividing by zero. |
| center | If True, add offset of beta to normalized tensor. If False, beta is ignored. |
| scale | If True, multiply by gamma. If False, gamma is not used. When the next layer is linear, this can be disabled since the scaling will be done by the next layer. |
| beta_initializer | Initializer for the beta weight. |
| gamma_initializer | Initializer for the gamma weight. |
| beta_regularizer | Optional regularizer for the beta weight. |
| gamma_regularizer | Optional regularizer for the gamma weight. |
| beta_constraint | Optional constraint for the beta weight. |
| gamma_constraint | Optional constraint for the gamma weight. |
| trainable | Boolean, if True the variables will be marked as trainable. |
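To make the roles of epsilon, center, and scale concrete, here is a sketch that reproduces the layer's default computation by hand. It assumes the default initializers, under which gamma is ones and beta is zeros, so a freshly built layer matches the manual result:

import tensorflow as tf

x = tf.constant([[1.0, 2.0, 3.0, 4.0]])
epsilon = 0.001

# Per-example statistics along the normalized axis.
mean = tf.reduce_mean(x, axis=-1, keepdims=True)
variance = tf.math.reduce_variance(x, axis=-1, keepdims=True)
normalized = (x - mean) / tf.sqrt(variance + epsilon)

# With center=True and scale=True the layer then applies
# gamma * normalized + beta; at initialization gamma is ones and
# beta is zeros, so both computations agree.
layer = tf.keras.layers.LayerNormalization(axis=-1, epsilon=epsilon)
print(layer(x))
print(normalized)  # same values up to floating-point differences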
Input shape
Arbitrary. Use the keyword argument input_shape (tuple of integers, does not include the samples axis) when using this layer as the first layer in a model.
Output shape
Same shape as input.
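A short sketch of both points, assuming a TF 2.x Sequential model (the (10, 20) shape here is an arbitrary choice for illustration):

import tensorflow as tf

# input_shape excludes the samples axis, per the note above.
model = tf.keras.Sequential([
    tf.keras.layers.LayerNormalization(axis=-1, input_shape=(10, 20)),
])
print(model.output_shape)  # (None, 10, 20): same shape as the input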
© 2020 The TensorFlow Authors. All rights reserved.
Licensed under the Creative Commons Attribution License 3.0.
Code samples licensed under the Apache 2.0 License.