Hard SiLU activation function, also known as Hard Swish.
```python
tf.keras.ops.hard_silu(x)
```
It is defined as:

```
0                  if x < -3
x                  if x > 3
x * (x + 3) / 6    if -3 <= x <= 3
```
It's a faster, piecewise linear approximation of the silu activation.
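The piecewise definition above can be expressed compactly as `x * clip(x + 3, 0, 6) / 6`, since clipping `x + 3` to `[0, 6]` reproduces all three branches. A minimal NumPy sketch (the function name `hard_silu_ref` is illustrative, not part of the Keras API):

```python
import numpy as np

def hard_silu_ref(x):
    # Hard SiLU: x * hard_sigmoid(x), where
    # hard_sigmoid(x) = clip(x + 3, 0, 6) / 6.
    # The clip handles all three pieces: it is 0 for x < -3,
    # 1 (i.e. 6/6) for x > 3, and (x + 3)/6 in between.
    x = np.asarray(x, dtype=np.float32)
    return x * np.clip(x + 3.0, 0.0, 6.0) / 6.0

print(hard_silu_ref([-4.0, -1.0, 0.0, 1.0, 4.0]))
```

Because every branch is linear (or a product of linear terms), this avoids the exponential in the standard sigmoid used by SiLU, which is where the speedup comes from.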
| Args | |
|---|---|
| `x` | Input tensor. |

| Returns | |
|---|---|
| A tensor with the same shape as `x`. | |
Example:

```python
x = keras.ops.convert_to_tensor([-3.0, -1.0, 0.0, 1.0, 3.0])
keras.ops.hard_silu(x)
# array([-0.0, -0.3333333, 0.0, 0.6666667, 3.0], shape=(5,), dtype=float32)
```
© 2022 The TensorFlow Authors. All rights reserved.
Licensed under the Creative Commons Attribution License 4.0.
Code samples licensed under the Apache 2.0 License.
https://www.tensorflow.org/api_docs/python/tf/keras/ops/hard_silu