CELU
class torch.nn.CELU(alpha=1.0, inplace=False) [source]

Applies the CELU function element-wise:

    CELU(x) = max(0, x) + min(0, α * (exp(x / α) − 1))

More details can be found in the paper Continuously Differentiable Exponential Linear Units.
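As a sketch, the definition CELU(x) = max(0, x) + min(0, α * (exp(x / α) − 1)) can be checked numerically against the module (the manual `expected` computation below is an illustration, not part of the API):

```python
import torch
import torch.nn as nn

alpha = 1.0
m = nn.CELU(alpha=alpha)
x = torch.linspace(-3.0, 3.0, steps=7)

# Element-wise definition, computed by hand:
# max(0, x) + min(0, alpha * (exp(x / alpha) - 1))
expected = torch.clamp(x, min=0) + torch.clamp(alpha * (torch.exp(x / alpha) - 1), max=0)

print(torch.allclose(m(x), expected))  # True
```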
Parameters
- alpha (float) – the value for the CELU formulation. Default: 1.0
- inplace (bool) – can optionally do the operation in-place. Default: False
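A short sketch of what the two parameters do in practice (the specific input values are illustrative only): larger `alpha` deepens the negative saturation (the output approaches −alpha as x → −∞), and `inplace=True` writes the result into the input tensor itself.

```python
import torch
import torch.nn as nn

x = torch.tensor([-2.0, -0.5, 0.0, 1.5])

# Negative inputs saturate toward -alpha; positive inputs pass through unchanged.
print(nn.CELU(alpha=1.0)(x))
print(nn.CELU(alpha=2.0)(x))

# inplace=True mutates its input tensor instead of allocating a new one.
y = x.clone()
nn.CELU(inplace=True)(y)
print(torch.equal(y, nn.CELU()(x)))  # True: the clone now holds the activated values
```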
Shape:
- Input: (*), where * means any number of dimensions.
- Output: (*), same shape as the input.
Examples:
>>> m = nn.CELU()
>>> input = torch.randn(2)
>>> output = m(input)
extra_repr() [source]

Return the extra representation of the module.

- Return type
  str
forward(input) [source]

Runs the forward pass.

- Return type
  Tensor