Lecun normal initializer.
Inherits From: VarianceScaling
```python
tf.keras.initializers.LecunNormal(
    seed=None
)
```
Also available via the shortcut function `tf.keras.initializers.lecun_normal`.
Initializers allow you to pre-specify an initialization strategy, encoded in the Initializer object, without knowing the shape and dtype of the variable being initialized.
Draws samples from a truncated normal distribution centered on 0 with `stddev = sqrt(1 / fan_in)`, where `fan_in` is the number of input units in the weight tensor.
```python
# Standalone usage:
initializer = tf.keras.initializers.LecunNormal()
values = initializer(shape=(2, 2))
```
```python
# Usage in a Keras layer:
initializer = tf.keras.initializers.LecunNormal()
layer = tf.keras.layers.Dense(3, kernel_initializer=initializer)
```
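As a quick sanity check on the `stddev = sqrt(1 / fan_in)` claim above, the following sketch (not from the original docs; the shape and seed are illustrative) draws a large sample and compares its empirical standard deviation to the expected value:

```python
import numpy as np
import tensorflow as tf

fan_in = 1000  # for a 2-D weight shape, fan_in is the first dimension
initializer = tf.keras.initializers.LecunNormal(seed=42)
values = initializer(shape=(fan_in, 256))

# The empirical stddev should be close to sqrt(1 / fan_in) ~= 0.0316.
print(np.std(values), np.sqrt(1.0 / fan_in))
```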
| Arguments | |
|---|---|
| `seed` | A Python integer. Used to seed the random generator. |
from_config
```python
@classmethod
from_config(
    config
)
```
Instantiates an initializer from a configuration dictionary.
```python
initializer = RandomUniform(-1, 1)
config = initializer.get_config()
initializer = RandomUniform.from_config(config)
```
| Args | |
|---|---|
| `config` | A Python dictionary. It will typically be the output of `get_config`. |
| Returns |
|---|
| An `Initializer` instance. |
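The same round trip works with `LecunNormal` itself; a minimal sketch (the seed value is illustrative, not from the original docs):

```python
import tensorflow as tf

initializer = tf.keras.initializers.LecunNormal(seed=7)
config = initializer.get_config()

# Rebuild an equivalent initializer from its configuration.
restored = tf.keras.initializers.LecunNormal.from_config(config)
assert restored.get_config() == config
```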
get_config
```python
get_config()
```
Returns the configuration of the initializer as a JSON-serializable dict.
| Returns |
|---|
| A JSON-serializable Python dict. |
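For `LecunNormal`, the configuration simply records the constructor arguments, so it can be dumped to JSON and later fed back to `from_config`. A minimal sketch (the seed and the exact dict contents shown in the comments are illustrative):

```python
import json
import tensorflow as tf

initializer = tf.keras.initializers.LecunNormal(seed=42)
config = initializer.get_config()

print(config)              # expected: {'seed': 42}
print(json.dumps(config))  # the dict is JSON-serializable
```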
__call__
```python
__call__(
    shape, dtype=None
)
```
Returns a tensor object initialized as specified by the initializer.
| Args | |
|---|---|
| `shape` | Shape of the tensor. |
| `dtype` | Optional dtype of the tensor. Only floating-point types are supported. If not specified, `tf.keras.backend.floatx()` is used, which defaults to `float32` unless you configured it otherwise (via `tf.keras.backend.set_floatx(float_dtype)`). |
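A minimal sketch of calling the initializer directly with an explicit dtype (`tf.float64` is just an illustrative floating-point choice):

```python
import tensorflow as tf

initializer = tf.keras.initializers.LecunNormal(seed=0)

# Request a specific floating-point dtype instead of the floatx() default.
values = initializer(shape=(4, 4), dtype=tf.float64)
print(values.dtype)  # <dtype: 'float64'>
```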
© 2020 The TensorFlow Authors. All rights reserved.
Licensed under the Creative Commons Attribution License 3.0.
Code samples licensed under the Apache 2.0 License.
https://www.tensorflow.org/versions/r2.3/api_docs/python/tf/keras/initializers/LecunNormal