LeakyReLU
-
class torch.nn.LeakyReLU(negative_slope=0.01, inplace=False) [source]
-
Applies the LeakyReLU function element-wise:

LeakyReLU(x) = max(0, x) + negative_slope * min(0, x)

or, equivalently,

LeakyReLU(x) = x if x >= 0, negative_slope * x otherwise
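LeakyReLU is commonly written either piecewise or as max(0, x) + negative_slope * min(0, x). As a quick sanity check that the two forms agree, here is a pure-Python sketch (scalar inputs only, not the actual torch implementation, which operates on tensors):

```python
def leaky_relu_piecewise(x, negative_slope=0.01):
    # Piecewise form: pass positives through, scale negatives.
    return x if x >= 0 else negative_slope * x

def leaky_relu_minmax(x, negative_slope=0.01):
    # Equivalent form: max(0, x) keeps the positive part,
    # negative_slope * min(0, x) scales the negative part.
    return max(0.0, x) + negative_slope * min(0.0, x)

# The two forms produce identical values for any input.
for x in [-3.0, -0.5, 0.0, 0.5, 3.0]:
    assert leaky_relu_piecewise(x) == leaky_relu_minmax(x)
```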
- Parameters
-
-
negative_slope (float) – Controls the angle of the negative slope (which is used for negative input values). Default: 1e-2
-
inplace (bool) – can optionally do the operation in-place. Default: False
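To illustrate what inplace=True means, here is a pure-Python sketch of an in-place variant that overwrites its input buffer instead of allocating a new one (a hypothetical helper on plain lists, mirroring the memory-saving behavior of the tensor version; not the torch implementation):

```python
def leaky_relu_(values, negative_slope=0.01):
    # In-place sketch: write results back into the input list,
    # analogous to what inplace=True does for a tensor's storage.
    for i, x in enumerate(values):
        values[i] = x if x >= 0 else negative_slope * x
    return values

data = [-2.0, -0.5, 1.0]
leaky_relu_(data, negative_slope=0.1)
# data itself now holds the activated values; no copy was made.
```

The trade-off is the same as for the tensor version: in-place operation saves memory but destroys the original input, which matters if it is still needed elsewhere (e.g. for autograd).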
- Shape:
-
- Input: (*), where * means any number of dimensions
- Output: (*), same shape as the input
Examples:
>>> m = nn.LeakyReLU(0.1)
>>> input = torch.randn(2)
>>> output = m(input)
-
extra_repr() [source]
-
Return the extra representation of the module.
- Return type
-
str
-
forward(input) [source]
-
Run forward pass.
- Return type
-
Tensor