Swish activation function, swish(x) = x * sigmoid(x).
tf.keras.activations.swish(x)
Swish activation function, which returns x * sigmoid(x). It is a smooth, non-monotonic function that consistently matches or outperforms ReLU on deep networks. It is unbounded above and bounded below.
>>> a = tf.constant([-20, -1.0, 0.0, 1.0, 20], dtype=tf.float32)
>>> b = tf.keras.activations.swish(a)
>>> b.numpy()
array([-4.1223075e-08, -2.6894143e-01, 0.0000000e+00, 7.3105860e-01,
       2.0000000e+01], dtype=float32)
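As a quick sanity check (an illustrative sketch, not part of the original page; the helper name manual_swish is introduced here for illustration), the values above can be reproduced directly from the definition x * sigmoid(x):

import tensorflow as tf

# Explicit form of the swish definition: x * sigmoid(x).
# (manual_swish is an illustrative helper, not a TensorFlow API.)
def manual_swish(x):
    return x * tf.sigmoid(x)

a = tf.constant([-20, -1.0, 0.0, 1.0, 20], dtype=tf.float32)

# The built-in activation and the explicit formula should agree
# to within floating-point tolerance.
diff = tf.reduce_max(tf.abs(tf.keras.activations.swish(a) - manual_swish(a)))
print(diff.numpy())  # ~0.0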
| Argument | Description |
|---|---|
| `x` | Input tensor. |
| Returns |
|---|
| The swish activation applied to `x` (see reference paper for details). |
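As a usage note (a sketch not taken from the original page; the layer sizes and input shape are arbitrary), the function can be passed directly as a layer activation when building a Keras model:

import tensorflow as tf

# Use swish as a hidden-layer activation by passing the function
# object directly; this avoids relying on a string alias being
# registered in a given TensorFlow version.
model = tf.keras.Sequential([
    tf.keras.layers.Dense(64, activation=tf.keras.activations.swish,
                          input_shape=(16,)),  # shape chosen for illustration
    tf.keras.layers.Dense(10),
])
model.summary()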
© 2020 The TensorFlow Authors. All rights reserved.
Licensed under the Creative Commons Attribution License 3.0.
Code samples licensed under the Apache 2.0 License.
https://www.tensorflow.org/versions/r2.3/api_docs/python/tf/keras/activations/swish