tf.nn.softmax_cross_entropy_with_logits_v2( _sentinel=None, labels=None, logits=None, dim=-1, name=None )
See the guide: Neural Network > Classification
Computes softmax cross entropy between logits and labels.
Measures the probability error in discrete classification tasks in which the classes are mutually exclusive (each entry is in exactly one class). For example, each CIFAR-10 image is labeled with one and only one label: an image can be a dog or a truck, but not both.
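As a minimal sketch (hypothetical values; TF 1.x graph mode with a Session is assumed), mutually exclusive classes are typically encoded as one-hot rows of labels:

    import tensorflow as tf

    # Two examples, three mutually exclusive classes; each labels row is one-hot.
    logits = tf.constant([[2.0, 0.5, -1.0],
                          [0.1, 0.1, 3.0]])   # shape [batch_size=2, num_classes=3]
    labels = tf.constant([[1.0, 0.0, 0.0],    # first example is class 0
                          [0.0, 0.0, 1.0]])   # second example is class 2

    loss = tf.nn.softmax_cross_entropy_with_logits_v2(labels=labels, logits=logits)

    with tf.Session() as sess:
        print(sess.run(loss))  # one cross-entropy value per example, shape [2]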
NOTE: While the classes are mutually exclusive, their probabilities need not be. All that is required is that each row of labels is a valid probability distribution. If they are not, the computation of the gradient will be incorrect.
If using exclusive labels (wherein one and only one class is true at a time), see sparse_softmax_cross_entropy_with_logits.
WARNING: This op expects unscaled logits, since it performs a softmax on logits internally for efficiency. Do not call this op with the output of softmax, as it will produce incorrect results.
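A short sketch of the pitfall (hypothetical values; TF 1.x graph mode assumed):

    import tensorflow as tf

    logits = tf.constant([[2.0, 0.5, -1.0]])
    labels = tf.constant([[1.0, 0.0, 0.0]])

    # Correct: pass the raw, unscaled logits; the op applies softmax itself.
    right = tf.nn.softmax_cross_entropy_with_logits_v2(labels=labels, logits=logits)

    # Incorrect: softmax ends up being applied twice, skewing the loss values.
    wrong = tf.nn.softmax_cross_entropy_with_logits_v2(
        labels=labels, logits=tf.nn.softmax(logits))

    with tf.Session() as sess:
        print(sess.run([right, wrong]))  # the two results differ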
logits and labels must have the same shape, e.g. [batch_size, num_classes], and the same dtype (either float16, float32, or float64).
Backpropagation will happen into both logits and labels. To disallow backpropagation into labels, pass label tensors through tf.stop_gradient before feeding them to this function.
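For example (a sketch with hypothetical tensors; the soft labels stand in for the output of another model, e.g. a distillation teacher):

    import tensorflow as tf

    logits = tf.constant([[2.0, 0.5, -1.0]])
    soft_labels = tf.constant([[0.7, 0.2, 0.1]])  # e.g. produced by a teacher model

    # tf.stop_gradient treats the labels as constants, so no gradient flows
    # back into whatever computation produced them.
    loss = tf.nn.softmax_cross_entropy_with_logits_v2(
        labels=tf.stop_gradient(soft_labels), logits=logits)

    # Gradients are now defined only along the logits path.
    grads = tf.gradients(loss, [logits])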
Note that to avoid confusion, it is required to pass only named arguments to this function.
Args:
_sentinel: Used to prevent positional parameters. Internal, do not use.
labels: Each row labels[i] must be a valid probability distribution.
logits: Unscaled log probabilities.
dim: The class dimension. Defaults to -1, which is the last dimension.
name: A name for the operation (optional).
Returns:
A Tensor of length batch_size of the same type as logits with the softmax cross entropy loss.
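Since the result holds one loss per example, a common pattern (illustrative, not part of this API; reusing labels and logits from the first sketch above) is to reduce it to a scalar before handing it to an optimizer:

    per_example_loss = tf.nn.softmax_cross_entropy_with_logits_v2(
        labels=labels, logits=logits)             # shape [batch_size]
    mean_loss = tf.reduce_mean(per_example_loss)  # scalar training loss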
© 2018 The TensorFlow Authors. All rights reserved.
Licensed under the Creative Commons Attribution License 3.0.
Code samples licensed under the Apache 2.0 License.