tf.losses.softmax_cross_entropy(
    onehot_labels,
    logits,
    weights=1.0,
    label_smoothing=0,
    scope=None,
    loss_collection=tf.GraphKeys.LOSSES,
    reduction=Reduction.SUM_BY_NONZERO_WEIGHTS
)
Creates a cross-entropy loss using tf.nn.softmax_cross_entropy_with_logits.
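As a minimal sketch of that relationship (the tensor values below are illustrative): with the default weights=1.0 and SUM_BY_NONZERO_WEIGHTS reduction, the result should agree with the mean of the per-example losses produced by the underlying op.

import tensorflow as tf

onehot_labels = tf.constant([[1., 0.], [0., 1.]])
logits = tf.constant([[2.0, 0.5], [1.0, 3.0]])

# Weighted, reduced loss from the tf.losses wrapper.
loss = tf.losses.softmax_cross_entropy(onehot_labels, logits)

# Per-example losses from the underlying op; with unit weights, the
# default reduction amounts to their mean.
per_example = tf.nn.softmax_cross_entropy_with_logits(
    labels=onehot_labels, logits=logits)
mean_loss = tf.reduce_mean(per_example)

with tf.Session() as sess:
    print(sess.run([loss, mean_loss]))  # the two values should agree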
weights acts as a coefficient for the loss. If a scalar is provided, then the loss is simply scaled by the given value. If weights is a tensor of shape [batch_size], then the loss weights apply to each corresponding sample.
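For example, a minimal sketch (values are illustrative) contrasting a scalar weight with per-sample weights of shape [batch_size]:

import tensorflow as tf

onehot_labels = tf.constant([[1., 0.], [0., 1.], [1., 0.]])
logits = tf.constant([[2.0, 0.5], [1.0, 3.0], [0.2, 0.1]])

# Scalar weight: the loss is simply scaled by 2.0.
scaled = tf.losses.softmax_cross_entropy(
    onehot_labels, logits, weights=2.0)

# Per-sample weights: the third sample contributes nothing, and the
# default reduction averages over the nonzero weights only.
per_sample = tf.losses.softmax_cross_entropy(
    onehot_labels, logits, weights=tf.constant([1.0, 1.0, 0.0]))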
If label_smoothing is nonzero, smooth the labels towards 1/num_classes:

new_onehot_labels = onehot_labels * (1 - label_smoothing) + label_smoothing / num_classes
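For instance, with num_classes = 2 and label_smoothing = 0.1, the hard label [1, 0] becomes [1, 0] * 0.9 + 0.05 = [0.95, 0.05].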
Args:
onehot_labels: [batch_size, num_classes] target one-hot-encoded labels.
logits: [batch_size, num_classes] logits outputs of the network.
weights: Optional Tensor whose rank is either 0, or rank 1 and is broadcastable to the loss, which is a Tensor of shape [batch_size].
label_smoothing: If greater than 0 then smooth the labels.
scope: the scope for the operations performed in computing the loss.
loss_collection: collection to which the loss will be added.
reduction: Type of reduction to apply to loss.
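As a sketch of the loss_collection behavior above (values are illustrative; the default collection is assumed), the returned loss is added to tf.GraphKeys.LOSSES and can be queried afterwards:

import tensorflow as tf

onehot_labels = tf.constant([[1., 0.], [0., 1.]])
logits = tf.constant([[2.0, 0.5], [1.0, 3.0]])

loss = tf.losses.softmax_cross_entropy(onehot_labels, logits)

# The loss was added to the default collection (tf.GraphKeys.LOSSES),
# so it shows up when querying the collected losses.
print(tf.losses.get_losses())      # contains the loss op built above
print(tf.losses.get_total_loss())  # total of collected (+ regularization) losses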
Returns:
Weighted loss Tensor of the same type as logits. If reduction is NONE, this has shape [batch_size]; otherwise, it is scalar.
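For instance, a minimal sketch (values are illustrative) showing how the reduction argument determines the shape of the result:

import tensorflow as tf

onehot_labels = tf.constant([[1., 0.], [0., 1.], [1., 0.]])
logits = tf.constant([[2.0, 0.5], [1.0, 3.0], [0.2, 0.1]])

# Default reduction: a scalar loss.
scalar_loss = tf.losses.softmax_cross_entropy(onehot_labels, logits)

# reduction=NONE: the unreduced per-sample loss of shape [batch_size].
per_sample_loss = tf.losses.softmax_cross_entropy(
    onehot_labels, logits, reduction=tf.losses.Reduction.NONE)

print(scalar_loss.shape)      # ()
print(per_sample_loss.shape)  # (3,)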
Raises:
ValueError: If the shape of logits doesn't match that of onehot_labels, or if the shape of weights is invalid, or if weights is None. Also if onehot_labels or logits is None.
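A minimal sketch of the first failure mode (shapes are illustrative): labels of shape [1, 2] against logits of shape [1, 3] fail the compatibility check when the op is built.

import tensorflow as tf

onehot_labels = tf.constant([[1., 0.]])  # shape [1, 2]
logits = tf.constant([[1.0, 2.0, 3.0]])  # shape [1, 3]

try:
    tf.losses.softmax_cross_entropy(onehot_labels, logits)
except ValueError as e:
    print(e)  # the shapes are reported as incompatible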