class torch.nn.RReLU(lower: float = 0.125, upper: float = 0.3333333333333333, inplace: bool = False)
Applies the randomized leaky rectified linear unit function, element-wise, as described in the paper:
Empirical Evaluation of Rectified Activations in Convolutional Network.
The function is defined as:

\text{RReLU}(x) = \begin{cases} x & \text{if } x \ge 0 \\ a x & \text{otherwise} \end{cases}

where a is randomly sampled from the uniform distribution \mathcal{U}(\text{lower}, \text{upper}).
See: https://arxiv.org/pdf/1505.00853.pdf
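As a rough sketch (not part of the original documentation), the behavior described above can be written out as a hypothetical reference function. The eval-time use of the fixed slope (lower + upper) / 2 is an assumption based on how torch.nn.functional.rrelu behaves when training=False:

import torch

def rrelu_reference(x, lower=1. / 8, upper=1. / 3, training=True):
    # Hypothetical reference implementation, for illustration only.
    if training:
        # Sample one slope per element from U(lower, upper).
        a = torch.empty_like(x).uniform_(lower, upper)
    else:
        # Assumed eval-time behavior: fixed slope at the midpoint of the range.
        a = torch.full_like(x, (lower + upper) / 2)
    return torch.where(x >= 0, x, a * x)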
Parameters
    lower (float) – lower bound of the uniform distribution. Default: 1/8
    upper (float) – upper bound of the uniform distribution. Default: 1/3
    inplace (bool) – can optionally do the operation in-place. Default: False

Shape:
    Input: (N, *) where * means any number of additional dimensions
    Output: (N, *), same shape as the input

Examples:
>>> m = nn.RReLU(0.1, 0.3)
>>> input = torch.randn(2)
>>> output = m(input)
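A hedged follow-up to the example above (not in the original page): after switching the module to evaluation mode, the negative-side slope is no longer resampled on every call, so repeated calls on the same input give the same output.

>>> m.eval()
>>> output = m(input)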