tf.nn.softmax

tf.nn.softmax(
    logits,
    axis=None,
    name=None,
    dim=None
)

Defined in tensorflow/python/ops/nn_ops.py.

See the guides: Layers (contrib) > Higher level ops for building neural network layers, Neural Network > Classification

Computes softmax activations. (deprecated arguments)

SOME ARGUMENTS ARE DEPRECATED. They will be removed in a future version. Instructions for updating: dim is deprecated; use axis instead.

This function performs the equivalent of

softmax = tf.exp(logits) / tf.reduce_sum(tf.exp(logits), axis, keepdims=True)
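
For example, the following minimal sketch (assuming eager execution is enabled, e.g. via tf.enable_eager_execution() in TF 1.x or by default in TF 2.x; the tensor values are illustrative) checks that tf.nn.softmax agrees with the manual formula:

import tensorflow as tf

# Illustrative logits: two examples with three classes each.
logits = tf.constant([[2.0, 1.0, 0.1],
                      [1.0, 3.0, 0.2]])

# Manual formula; keepdims=True keeps the reduced dimension so the division broadcasts.
manual = tf.exp(logits) / tf.reduce_sum(tf.exp(logits), axis=-1, keepdims=True)

# Built-in op.
builtin = tf.nn.softmax(logits, axis=-1)

# Each row of either result sums to 1, and the two agree to floating-point precision.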

Args:

  • logits: A non-empty Tensor. Must be one of the following types: half, float32, float64.
  • axis: The dimension along which softmax is performed. Defaults to -1, which indicates the last dimension (a usage sketch follows the Raises section).
  • name: A name for the operation (optional).
  • dim: Deprecated alias for axis.

Returns:

A Tensor. Has the same type and shape as logits.

Raises:

  • InvalidArgumentError: if logits is empty or axis is beyond the last dimension of logits.
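
As a brief usage sketch (again assuming eager execution; the input tensor is illustrative), axis selects which dimension is normalized:

import tensorflow as tf

x = tf.constant([[1.0, 2.0, 3.0],
                 [4.0, 5.0, 6.0]])

# Default axis=-1: softmax over the last dimension, so each row sums to 1.
row_softmax = tf.nn.softmax(x)

# axis=0: softmax over the first dimension, so each column sums to 1.
col_softmax = tf.nn.softmax(x, axis=0)

Passing an axis outside the rank of x (for example, axis=2 for this 2-D tensor) raises the InvalidArgumentError described above.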

© 2018 The TensorFlow Authors. All rights reserved.
Licensed under the Creative Commons Attribution License 3.0.
Code samples licensed under the Apache 2.0 License.
https://www.tensorflow.org/api_docs/python/tf/nn/softmax