Computes Concatenated ReLU.
```
tf.compat.v2.nn.crelu(
    features, axis=-1, name=None
)
```
Concatenates a ReLU which selects only the positive part of the activation with a ReLU which selects only the negative part of the activation. Note that as a result this non-linearity doubles the depth of the activations. Source: Understanding and Improving Convolutional Neural Networks via Concatenated Rectified Linear Units. W. Shang, et al.
| Args | |
|---|---|
| `features` | A `Tensor` with type `float`, `double`, `int32`, `int64`, `uint8`, `int16`, or `int8`. |
| `axis` | The axis that the output values are concatenated along. Default is -1. |
| `name` | A name for the operation (optional). |

| Returns | |
|---|---|
| A `Tensor` with the same type as `features`. |
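The behavior described above can be sketched in plain NumPy (a minimal illustration of the concatenated-ReLU idea, not the TensorFlow implementation; the function name `crelu` here is just for this example):

```python
import numpy as np

def crelu(features, axis=-1):
    """Sketch of Concatenated ReLU: concatenates relu(x) and relu(-x)
    along `axis`, so that axis's size doubles."""
    pos = np.maximum(features, 0.0)   # positive part of the activation
    neg = np.maximum(-features, 0.0)  # negative part, sign-flipped
    return np.concatenate([pos, neg], axis=axis)

x = np.array([[-1.0, 2.0],
              [3.0, -4.0]])
y = crelu(x)
print(y)        # [[0. 2. 1. 0.]
                #  [3. 0. 0. 4.]]
print(y.shape)  # (2, 4) -- the last dimension doubled from 2 to 4
```

Note how no information is discarded: a standard ReLU zeroes out negative values, whereas CReLU preserves them (with flipped sign) in the second half of the concatenated output.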
© 2020 The TensorFlow Authors. All rights reserved.
Licensed under the Creative Commons Attribution License 3.0.
Code samples licensed under the Apache 2.0 License.