tf.nn.softmax_cross_entropy_with_logits_v2(_sentinel=None, labels=None, logits=None, dim=-1, name=None)
Defined in tensorflow/python/ops/nn_ops.py.
See the guide: Neural Network > Classification
Computes softmax cross entropy between logits and labels.
Measures the probability error in discrete classification tasks in which the classes are mutually exclusive (each entry is in exactly one class). For example, each CIFAR-10 image is labeled with one and only one label: an image can be a dog or a truck, but not both.
NOTE: While the classes are mutually exclusive, their probabilities need not be. All that is required is that each row of labels is a valid probability distribution. If they are not, the computation of the gradient will be incorrect.
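For instance, the label rows may be soft distributions rather than one-hot vectors. A minimal sketch (the numeric values are illustrative):

import tensorflow as tf

# Soft labels: each row sums to 1, even though the underlying
# classes are mutually exclusive.
labels = tf.constant([[0.9, 0.05, 0.05],
                      [0.1, 0.8,  0.1]])
logits = tf.constant([[2.0, 0.5, 0.1],
                      [0.3, 2.2, 0.4]])  # raw, unscaled scores
loss = tf.nn.softmax_cross_entropy_with_logits_v2(labels=labels, logits=logits)
# `loss` has shape [2]: one cross-entropy value per label row.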
If using exclusive labels (wherein one and only one class is true at a time), see sparse_softmax_cross_entropy_with_logits.
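The difference in label format, as a sketch (values illustrative): the sparse variant takes one integer class index per row, while this op takes a full distribution per row.

import tensorflow as tf

logits = tf.constant([[2.0, 1.0, 0.1]])

# Dense labels: a full probability distribution per row.
dense_loss = tf.nn.softmax_cross_entropy_with_logits_v2(
    labels=tf.constant([[1.0, 0.0, 0.0]]), logits=logits)

# Sparse labels: a single integer class index per row.
sparse_loss = tf.nn.sparse_softmax_cross_entropy_with_logits(
    labels=tf.constant([0]), logits=logits)

# Both losses are equal here, since the dense row is exactly one-hot.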
WARNING: This op expects unscaled logits, since it performs a softmax on logits internally for efficiency. Do not call this op with the output of softmax, as it will produce incorrect results.
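For example (illustrative values), feed the raw pre-softmax scores directly:

import tensorflow as tf

labels = tf.constant([[1.0, 0.0, 0.0]])
logits = tf.constant([[2.0, 1.0, 0.1]])  # raw, unscaled scores

# Correct: the op applies softmax to `logits` internally.
loss = tf.nn.softmax_cross_entropy_with_logits_v2(labels=labels, logits=logits)

# Incorrect: softmax would be applied twice, silently producing a wrong loss.
probs = tf.nn.softmax(logits)
wrong_loss = tf.nn.softmax_cross_entropy_with_logits_v2(labels=labels, logits=probs)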
logits and labels must have the same shape, e.g. [batch_size, num_classes], and the same dtype (either float16, float32, or float64).
Backpropagation will happen into both logits and labels. To disallow backpropagation into labels, pass label tensors through tf.stop_gradient before feeding them to this function.
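A minimal sketch of blocking gradients into the label branch, assuming the labels come from another trainable part of the graph (for example a teacher network; the variable names are illustrative):

import tensorflow as tf

teacher_logits = tf.Variable([[1.5, 0.5, 0.2]])  # e.g. a teacher network's output
student_logits = tf.Variable([[2.0, 1.0, 0.1]])
soft_labels = tf.nn.softmax(teacher_logits)

# tf.stop_gradient blocks backpropagation into `soft_labels`;
# only `student_logits` receives gradients from this loss.
loss = tf.nn.softmax_cross_entropy_with_logits_v2(
    labels=tf.stop_gradient(soft_labels), logits=student_logits)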
Note that to avoid confusion, it is required to pass only named arguments to this function.
Args:
  _sentinel: Used to prevent positional parameters. Internal, do not use.
  labels: Each row labels[i] must be a valid probability distribution.
  logits: Unscaled log probabilities.
  dim: The class dimension. Defaults to -1, which is the last dimension.
  name: A name for the operation (optional).

Returns:
  A 1-D Tensor of length batch_size, of the same type as logits, with the softmax cross entropy loss.
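A minimal end-to-end sketch (TF 1.x session style; the numeric values are illustrative):

import tensorflow as tf

labels = tf.constant([[1.0, 0.0, 0.0],
                      [0.0, 1.0, 0.0]])  # batch_size = 2, one-hot rows
logits = tf.constant([[2.0, 1.0, 0.1],
                      [0.5, 2.5, 0.3]])  # raw, unscaled scores

# One cross-entropy value per row: a 1-D Tensor of length batch_size.
per_example = tf.nn.softmax_cross_entropy_with_logits_v2(
    labels=labels, logits=logits)
# A scalar training loss is typically obtained by reducing over the batch.
mean_loss = tf.reduce_mean(per_example)

with tf.Session() as sess:
    print(sess.run(per_example))  # shape [2]
    print(sess.run(mean_loss))    # scalar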
© 2018 The TensorFlow Authors. All rights reserved.
Licensed under the Creative Commons Attribution License 3.0.
Code samples licensed under the Apache 2.0 License.
https://www.tensorflow.org/api_docs/python/tf/nn/softmax_cross_entropy_with_logits_v2