tf.contrib.losses.sparse_softmax_cross_entropy(logits, labels, weights=1.0, scope=None)
Cross-entropy loss using tf.nn.sparse_softmax_cross_entropy_with_logits. (deprecated)
THIS FUNCTION IS DEPRECATED. It will be removed after 2016-12-30. Instructions for updating: Use tf.losses.sparse_softmax_cross_entropy instead. Note that the order of the logits and labels arguments has been changed.
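Migrating a call is a matter of swapping the first two arguments, as the note above says. A minimal sketch (logits and labels are placeholder tensors, not defined here):

# Deprecated API: logits comes first.
loss = tf.contrib.losses.sparse_softmax_cross_entropy(logits, labels)

# Replacement API: labels comes first.
loss = tf.losses.sparse_softmax_cross_entropy(labels, logits)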
weights acts as a coefficient for the loss. If a scalar is provided, then the loss is simply scaled by the given value. If weights is a tensor of size [batch_size], then the loss weights apply to each corresponding sample.
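A minimal sketch of both weighting modes (assumes a TF 1.x runtime where tf.contrib is still available; the tensor values are illustrative):

import tensorflow as tf

# [batch_size=2, num_classes=3] logits and [batch_size] integer labels.
logits = tf.constant([[2.0, 0.5, -1.0],
                      [0.1, 1.5, 0.3]])
labels = tf.constant([0, 1], dtype=tf.int64)

# Scalar weight: the mean loss is simply scaled by 0.5.
scaled = tf.contrib.losses.sparse_softmax_cross_entropy(
    logits, labels, weights=0.5)

# Per-sample weights of shape [batch_size]: the second example is
# weighted by 0.0 and contributes nothing to the loss.
masked = tf.contrib.losses.sparse_softmax_cross_entropy(
    logits, labels, weights=tf.constant([1.0, 0.0]))

with tf.Session() as sess:
    print(sess.run([scaled, masked]))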
Args:

logits: [batch_size, num_classes] logits outputs of the network.
labels: [batch_size, 1] or [batch_size] labels of dtype int32 or int64 in the range [0, num_classes).
weights: Coefficients for the loss. The tensor must be a scalar or a tensor of shape [batch_size] or [batch_size, 1].
scope: The scope for the operations performed in computing the loss.
Returns:

A scalar Tensor representing the mean loss value.
Raises:

ValueError: If the shapes of logits, labels, and weights are incompatible, or if weights is None.
© 2018 The TensorFlow Authors. All rights reserved.
Licensed under the Creative Commons Attribution License 3.0.
Code samples licensed under the Apache 2.0 License.