tf.keras.activations.selu(x)
Defined in tensorflow/python/keras/_impl/keras/activations.py.
Scaled Exponential Linear Unit (Klambauer et al., 2017). SELU is equal to `scale * elu(x, alpha)`, where `alpha` and `scale` are pre-defined constants.
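As a minimal sketch of the element-wise computation, the snippet below reimplements the piecewise SELU formula using the constant values published by Klambauer et al. (2017) and compares it against `tf.keras.activations.selu`; the helper name `selu_reference` and the sample inputs are illustrative assumptions, not part of the API.

```python
import tensorflow as tf

# Constants from Klambauer et al. (2017).
_ALPHA = 1.6732632423543772
_SCALE = 1.0507009873554805

def selu_reference(x):
    # scale * x for x > 0, scale * alpha * (exp(x) - 1) otherwise.
    return _SCALE * tf.where(x > 0.0, x, _ALPHA * (tf.exp(x) - 1.0))

x = tf.constant([-2.0, 0.0, 1.5])
print(selu_reference(x))             # reference computation
print(tf.keras.activations.selu(x))  # library implementation; values match under eager execution
```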
Arguments:
  x: A tensor or variable to compute the activation function for.

Returns:
  Tensor with the same shape and dtype as `x`.
Note:
- To be used together with the initialization "lecun_normal".
- To be used together with the dropout variant "AlphaDropout".
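A hedged usage sketch of that pairing follows: a small Keras model using the SELU activation with the "lecun_normal" initializer and AlphaDropout. The layer widths, dropout rate, input shape, and compile settings are arbitrary choices for illustration, not recommendations from this page.

```python
import tensorflow as tf

# SELU paired with "lecun_normal" initialization and AlphaDropout, as noted above.
# Layer sizes, dropout rate, and input shape are arbitrary for this example.
model = tf.keras.Sequential([
    tf.keras.layers.Dense(64, activation='selu',
                          kernel_initializer='lecun_normal',
                          input_shape=(32,)),
    tf.keras.layers.AlphaDropout(0.1),
    tf.keras.layers.Dense(10, activation='softmax'),
])
model.compile(optimizer='sgd', loss='categorical_crossentropy')
```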
https://www.tensorflow.org/api_docs/python/tf/keras/activations/selu