tf.nn.crelu(
    features,
    name=None,
    axis=-1
)
Defined in tensorflow/python/ops/nn_ops.py.
See the guide: Neural Network > Activation Functions
Computes Concatenated ReLU.
Concatenates a ReLU which selects only the positive part of the activation with a ReLU which selects only the negative part of the activation. Note that as a result this non-linearity doubles the depth of the activations. Source: Understanding and Improving Convolutional Neural Networks via Concatenated Rectified Linear Units. W. Shang, et al.
Args:

features: A Tensor with type float, double, int32, int64, uint8, int16, or int8.
name: A name for the operation (optional).
axis: The axis that the output values are concatenated along. Default is -1.

Returns:

A Tensor with the same type as features.
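For example, the following sketch (not part of the original reference; it assumes TensorFlow 1.x graph execution and uses made-up input values) illustrates that tf.nn.crelu(x) matches concatenating tf.nn.relu(x) and tf.nn.relu(-x) along the given axis, doubling that dimension:

import tensorflow as tf

# Illustrative input: shape (2, 3).
x = tf.constant([[-1.0,  2.0, -3.0],
                 [ 4.0, -5.0,  6.0]])

# Concatenated ReLU: output shape becomes (2, 6).
y = tf.nn.crelu(x)

# Equivalent formulation using plain ReLU and concat along the default axis.
y_manual = tf.concat([tf.nn.relu(x), tf.nn.relu(-x)], axis=-1)

with tf.Session() as sess:
    print(sess.run(y))         # [[0. 2. 0. 1. 0. 3.]
                               #  [4. 0. 6. 0. 5. 0.]]
    print(sess.run(y_manual))  # identical values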
https://www.tensorflow.org/api_docs/python/tf/nn/crelu