
TensorFlow 1.15

tf.nn.leaky_relu


Compute the Leaky ReLU activation function: f(x) = x for x >= 0 and f(x) = alpha * x for x < 0 (equivalently, max(alpha * x, x) for alpha < 1).

Source: Rectifier Nonlinearities Improve Neural Network Acoustic Models. A. L. Maas, A. Y. Hannun, A. Y. Ng. Proc. ICML, 2013.

Args:
  features: A Tensor representing preactivation values. Must be one of the following types: float16, float32, float64, int32, int64.
  alpha: Slope of the activation function at x < 0. Defaults to 0.2.
  name: A name for the operation (optional).

Returns:
  The activation value.
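The elementwise rule described above can be sketched in NumPy. This reimplements only the math, not the TensorFlow op itself; the `alpha=0.2` default mirrors the `tf.nn.leaky_relu` signature.

```python
import numpy as np

def leaky_relu(features, alpha=0.2):
    """NumPy sketch of Leaky ReLU: x for x >= 0, alpha * x for x < 0."""
    features = np.asarray(features, dtype=np.float64)
    # Keep non-negative values unchanged; scale negative values by alpha.
    return np.where(features >= 0, features, alpha * features)

x = np.array([-2.0, -0.5, 0.0, 1.0, 3.0])
print(leaky_relu(x))             # negative entries scaled by 0.2
print(leaky_relu(x, alpha=0.1))  # negative entries scaled by 0.1
```

Unlike plain ReLU, the negative inputs retain a small nonzero gradient (alpha), which is the motivation given in the Maas et al. paper cited above.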

© 2020 The TensorFlow Authors. All rights reserved.
Licensed under the Creative Commons Attribution License 3.0.
Code samples licensed under the Apache 2.0 License.
https://www.tensorflow.org/versions/r1.15/api_docs/python/tf/nn/leaky_relu