Leaky version of a Rectified Linear Unit.
Inherits From: Layer
```python
tf.keras.layers.LeakyReLU(
    alpha=0.3, **kwargs
)
```
It allows a small gradient when the unit is not active:
```
f(x) = alpha * x  if x < 0
f(x) = x          if x >= 0
```
Usage:

```python
>>> layer = tf.keras.layers.LeakyReLU()
>>> output = layer([-3.0, -1.0, 0.0, 2.0])
>>> list(output.numpy())
[-0.9, -0.3, 0.0, 2.0]
>>> layer = tf.keras.layers.LeakyReLU(alpha=0.1)
>>> output = layer([-3.0, -1.0, 0.0, 2.0])
>>> list(output.numpy())
[-0.3, -0.1, 0.0, 2.0]
```
Input shape:
Arbitrary. Use the keyword argument `input_shape` (a tuple of integers, not including the batch axis) when using this layer as the first layer in a model.

Output shape:
Same shape as the input.
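For illustration, a minimal sketch of passing `input_shape` when LeakyReLU is the first layer of a model (the model structure and shapes here are arbitrary, not part of the API reference):

```python
import tensorflow as tf

# A small Sequential model where LeakyReLU is the first layer.
# input_shape excludes the batch axis, so (4,) means batches of 4-element vectors.
model = tf.keras.Sequential([
    tf.keras.layers.LeakyReLU(alpha=0.2, input_shape=(4,)),
    tf.keras.layers.Dense(1),
])

# The LeakyReLU layer's output shape matches its input shape (batch axis is None).
print(model.layers[0].output_shape)  # (None, 4)
```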
| Arguments | |
|---|---|
| `alpha` | Float >= 0. Negative slope coefficient. Defaults to 0.3. |
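As a sketch of how `alpha` is typically used (the layer sizes and the value 0.1 are arbitrary choices, not recommendations from this reference), a common pattern is to follow a linear Dense layer with a LeakyReLU activation so that negative pre-activations keep a small gradient:

```python
import tensorflow as tf

# Dense layer with no activation, followed by LeakyReLU:
# negative pre-activations are scaled by alpha instead of being zeroed out.
model = tf.keras.Sequential([
    tf.keras.layers.Dense(16, input_shape=(8,)),  # linear pre-activation
    tf.keras.layers.LeakyReLU(alpha=0.1),         # negative slope of 0.1
    tf.keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")
model.summary()
```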
© 2020 The TensorFlow Authors. All rights reserved.
Licensed under the Creative Commons Attribution License 3.0.
Code samples licensed under the Apache 2.0 License.
https://www.tensorflow.org/versions/r2.3/api_docs/python/tf/keras/layers/LeakyReLU