Computes elementwise softplus: softplus(x) = log(exp(x) + 1).
```python
tf.math.softplus(
    features, name=None
)
```
`softplus` is a smooth approximation of `relu`. Like `relu`, its output is never negative; unlike `relu`, it is strictly positive for every input and differentiable everywhere.
```python
import tensorflow as tf

tf.math.softplus(tf.range(0, 2, dtype=tf.float32)).numpy()
array([0.6931472, 1.3132616], dtype=float32)
```
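As an illustrative sketch (not part of the original docs), the snippet below evaluates `tf.math.softplus` alongside a direct translation of the formula and `tf.nn.relu`, showing that softplus tracks relu for inputs of large magnitude while staying smooth and strictly positive near zero. The inline comments are approximate expected values.

```python
import tensorflow as tf

x = tf.constant([-20.0, -1.0, 0.0, 1.0, 20.0])

# Built-in softplus: strictly positive, smooth at x = 0.
print(tf.math.softplus(x).numpy())
# approx [2.1e-09, 0.3133, 0.6931, 1.3133, 20.0]

# Direct translation of log(exp(x) + 1); tf.math.softplus itself uses a
# numerically stable kernel, so this line is only for comparison.
print(tf.math.log(tf.math.exp(x) + 1.0).numpy())

# relu clips to exactly zero for non-positive inputs and is not smooth at 0.
print(tf.nn.relu(x).numpy())
# [ 0.  0.  0.  1. 20.]
```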
| Args | |
|---|---|
| `features` | A `Tensor`. |
| `name` | Optional: a name to associate with this operation. |
| Returns | |
|---|---|
| A `Tensor` with the same shape and type as `features`. |
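A minimal sketch confirming that the result keeps the input's shape and dtype; the specific shape and `float64` dtype below are arbitrary choices for illustration.

```python
import tensorflow as tf

x = tf.random.normal([3, 4], dtype=tf.float64)
y = tf.math.softplus(x)

# The output mirrors the input's shape and dtype.
print(y.shape, y.dtype)  # (3, 4) <dtype: 'float64'>
```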