
Softmin

class torch.nn.Softmin(dim: Optional[int] = None) [source]

Applies the Softmin function to an n-dimensional input Tensor, rescaling it so that the elements of the n-dimensional output Tensor lie in the range [0, 1] and sum to 1.

Softmin is defined as:

\text{Softmin}(x_i) = \frac{\exp(-x_i)}{\sum_j \exp(-x_j)}
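
The formula can be read directly as code: Softmin(x) is Softmax applied to -x. Below is a minimal sketch under that reading of the definition; the helper name manual_softmin is illustrative only and not part of torch.nn:

import torch

def manual_softmin(x, dim):
    # exp(-x_i) / sum_j exp(-x_j), computed along `dim`
    e = torch.exp(-x)
    return e / e.sum(dim=dim, keepdim=True)

x = torch.randn(2, 3)
print(manual_softmin(x, dim=1))
print(torch.softmax(-x, dim=1))  # matches: Softmin(x) == Softmax(-x)
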
Shape:
  • Input: (*), where * means any number of additional dimensions
  • Output: (*), same shape as the input
Parameters

dim (int) – A dimension along which Softmin will be computed (so every slice along dim will sum to 1).

Returns

a Tensor of the same dimension and shape as the input, with values in the range [0, 1]

Examples:

>>> import torch
>>> from torch import nn
>>> m = nn.Softmin(dim=1)       # compute Softmin along dimension 1
>>> input = torch.randn(2, 3)
>>> output = m(input)           # same shape as input; each row sums to 1
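
As a quick check of the documented properties (values in [0, 1], and every slice along dim summing to 1), the following sketch uses the functional form torch.nn.functional.softmin, which takes the same dim argument:

import torch
import torch.nn.functional as F

input = torch.randn(2, 3)
output = F.softmin(input, dim=1)                  # functional equivalent of nn.Softmin(dim=1)
print(output.sum(dim=1))                          # each row sums to 1 (up to floating-point error)
print((output >= 0).all(), (output <= 1).all())   # all values lie in [0, 1]
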

© 2019 Torch Contributors
Licensed under the 3-clause BSD License.
https://pytorch.org/docs/1.7.0/generated/torch.nn.Softmin.html