Scaled Exponential Linear Unit (SELU) activation function.
tf.keras.ops.selu(x)
It is defined as f(x) = scale * alpha * (exp(x) - 1.) for x < 0 and f(x) = scale * x for x >= 0, where alpha and scale are fixed constants (alpha ≈ 1.67326324, scale ≈ 1.05070098).
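As an illustration of the definition above, here is a minimal NumPy sketch (not the library implementation), assuming the standard SELU constants alpha ≈ 1.67326324 and scale ≈ 1.05070098:

import numpy as np

# Reference sketch of the piecewise SELU formula; alpha and scale are the
# fixed constants from the SELU paper (values truncated here).
def selu_reference(x, alpha=1.67326324, scale=1.05070098):
    x = np.asarray(x, dtype=np.float64)
    return scale * np.where(x >= 0, x, alpha * (np.exp(x) - 1.0))

print(selu_reference([-1.0, 0.0, 1.0]))
# approximately [-1.11133055, 0., 1.05070098]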
| Args | |
|---|---|
| x | Input tensor. |
| Returns | |
|---|---|
| A tensor with the same shape as x. |
import numpy as np
import keras

x = np.array([-1., 0., 1.])
x_selu = keras.ops.selu(x)
print(x_selu)
# array([-1.11133055, 0., 1.05070098], shape=(3,), dtype=float64)
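In practice the activation is often selected by name on a layer rather than called through keras.ops directly; a minimal sketch, assuming a keras.layers.Dense layer and the lecun_normal initializer commonly paired with SELU:

import keras

# Dense layer applying SELU to its outputs; lecun_normal is the
# initialization recommended alongside SELU in the Keras docs.
layer = keras.layers.Dense(32, activation="selu", kernel_initializer="lecun_normal")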