tf.nn.leaky_relu(
features,
alpha=0.2,
name=None
)
Defined in tensorflow/python/ops/nn_ops.py.
Computes the Leaky ReLU activation function.
"Rectifier Nonlinearities Improve Neural Network Acoustic Models" AL Maas, AY Hannun, AY Ng - Proc. ICML, 2013 http://web.stanford.edu/~awni/papers/relu_hybrid_icml2013_final.pdf
Args:
  features: A Tensor representing preactivation values. Must be one of the following types: float16, float32, float64, int32, int64.
  alpha: Slope of the activation function at x < 0.
  name: A name for the operation (optional).

Returns:
  The activation value.
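The operation computes max(alpha * x, x) elementwise. A minimal pure-Python sketch of the same math (not the TensorFlow implementation itself) illustrates the behavior, with the equivalent tf.nn.leaky_relu call shown in a comment:

```python
def leaky_relu(x, alpha=0.2):
    """Elementwise Leaky ReLU: x for x >= 0, alpha * x for x < 0."""
    return x if x >= 0 else alpha * x

# Equivalent TensorFlow usage (assumes tensorflow is installed):
#   import tensorflow as tf
#   tf.nn.leaky_relu(tf.constant([-1.0, 2.0]), alpha=0.2)

print(leaky_relu(2.0))    # positive inputs pass through unchanged
print(leaky_relu(-1.0))   # negative inputs are scaled by alpha
```

Unlike the standard ReLU, the nonzero slope alpha on the negative side keeps gradients flowing for negative preactivations.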
© 2018 The TensorFlow Authors. All rights reserved.
Licensed under the Creative Commons Attribution License 3.0.
Code samples licensed under the Apache 2.0 License.
https://www.tensorflow.org/api_docs/python/tf/nn/leaky_relu