Softsign activation function, softsign(x) = x / (abs(x) + 1).
tf.keras.activations.softsign(
    x
)
>>> a = tf.constant([-1.0, 0.0, 1.0], dtype=tf.float32)
>>> b = tf.keras.activations.softsign(a)
>>> b.numpy()
array([-0.5,  0. ,  0.5], dtype=float32)
| Args | |
|---|---|
| x | Input tensor. | 
| Returns | |
|---|---|
| The softsign activation: x / (abs(x) + 1). | 
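For illustration, the same formula can be reproduced outside TensorFlow. The following is a minimal NumPy sketch of softsign(x) = x / (abs(x) + 1), not the library's implementation:

```python
import numpy as np

def softsign(x):
    # softsign(x) = x / (|x| + 1); bounded in (-1, 1), like a gentler tanh
    return x / (np.abs(x) + 1.0)

a = np.array([-1.0, 0.0, 1.0], dtype=np.float32)
print(softsign(a))  # [-0.5  0.   0.5]
```

This matches the `tf.keras.activations.softsign` output shown in the example above.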
    © 2022 The TensorFlow Authors. All rights reserved.
Licensed under the Creative Commons Attribution License 4.0.
Code samples licensed under the Apache 2.0 License.
    https://www.tensorflow.org/versions/r2.9/api_docs/python/tf/keras/activations/softsign