Rectified linear unit activation function.
```python
tf.keras.ops.relu(x)
```
It is defined as f(x) = max(0, x).
| Args | |
|---|---|
| `x` | Input tensor. |

| Returns | |
|---|---|
| A tensor with the same shape as `x`. | |
```python
x1 = keras.ops.convert_to_tensor([-1.0, 0.0, 1.0, 0.2])
keras.ops.relu(x1)
# array([0.0, 0.0, 1.0, 0.2], dtype=float32)
```
© 2022 The TensorFlow Authors. All rights reserved.
Licensed under the Creative Commons Attribution License 4.0.
Code samples licensed under the Apache 2.0 License.
https://www.tensorflow.org/api_docs/python/tf/keras/ops/relu