tf.contrib.losses.softmax_cross_entropy(logits, onehot_labels, weights=1.0, label_smoothing=0, scope=None)
Creates a cross-entropy loss using tf.nn.softmax_cross_entropy_with_logits. (deprecated)
THIS FUNCTION IS DEPRECATED. It will be removed after 2016-12-30. Instructions for updating: Use tf.losses.softmax_cross_entropy instead. Note that the order of the logits and labels arguments has been changed.
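A minimal migration sketch, assuming TF 1.x graph mode (the tensor values are illustrative); note the swapped argument order in the replacement:

import tensorflow as tf  # TF 1.x

logits = tf.constant([[2.0, 1.0, 0.1]])         # [batch_size=1, num_classes=3]
onehot_labels = tf.constant([[1.0, 0.0, 0.0]])  # one-hot targets

# Deprecated call: logits come first.
old_loss = tf.contrib.losses.softmax_cross_entropy(logits, onehot_labels)

# Replacement call: onehot_labels come first.
new_loss = tf.losses.softmax_cross_entropy(onehot_labels, logits)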
weights acts as a coefficient for the loss. If a scalar is provided, then the loss is simply scaled by the given value. If weights is a tensor of size [batch_size], then the loss weights apply to each corresponding sample.
If label_smoothing is nonzero, smooth the labels towards 1/num_classes: new_onehot_labels = onehot_labels * (1 - label_smoothing) + label_smoothing / num_classes.
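As a worked illustration of the formula above (the values are made up): with num_classes = 4 and label_smoothing = 0.1, the hard label [0, 1, 0, 0] becomes [0.025, 0.925, 0.025, 0.025]:

import numpy as np

label_smoothing = 0.1
num_classes = 4
onehot_labels = np.array([0.0, 1.0, 0.0, 0.0])

# Apply the smoothing formula from the description above.
new_onehot_labels = onehot_labels * (1 - label_smoothing) + label_smoothing / num_classes
print(new_onehot_labels)  # [0.025 0.925 0.025 0.025]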
Args:
logits: [batch_size, num_classes] logits outputs of the network.
onehot_labels: [batch_size, num_classes] one-hot-encoded labels.
weights: Coefficients for the loss. The tensor must be a scalar or a tensor of shape [batch_size].
label_smoothing: If greater than 0 then smooth the labels.
scope: The scope for the operations performed in computing the loss.
Returns:
A Tensor representing the mean loss value.
Raises:
ValueError: If the shape of logits doesn't match that of onehot_labels, or if the shape of weights is invalid, or if weights is None.
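A self-contained usage sketch, assuming TF 1.x with tf.contrib available (the tensor values are illustrative); it contrasts a scalar weight with a per-sample [batch_size] weights tensor:

import tensorflow as tf  # TF 1.x

logits = tf.constant([[2.0, 0.5, 0.3],
                      [0.1, 1.5, 0.2]])          # [batch_size=2, num_classes=3]
onehot_labels = tf.constant([[1.0, 0.0, 0.0],
                             [0.0, 1.0, 0.0]])

# Scalar weight: the mean loss is simply scaled by 0.5.
loss_scaled = tf.contrib.losses.softmax_cross_entropy(
    logits, onehot_labels, weights=0.5)

# Per-sample weights: the second sample contributes twice as much as the first.
loss_weighted = tf.contrib.losses.softmax_cross_entropy(
    logits, onehot_labels, weights=tf.constant([1.0, 2.0]))

with tf.Session() as sess:
    print(sess.run([loss_scaled, loss_weighted]))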