Cross-entropy loss using `tf.nn.sparse_softmax_cross_entropy_with_logits`.
```python
tf.compat.v1.losses.sparse_softmax_cross_entropy(
    labels, logits, weights=1.0, scope=None,
    loss_collection=tf.GraphKeys.LOSSES,
    reduction=Reduction.SUM_BY_NONZERO_WEIGHTS
)
```
`weights` acts as a coefficient for the loss. If a scalar is provided, then the loss is simply scaled by the given value. If `weights` is a tensor of shape `[batch_size]`, then the loss weights apply to each corresponding sample.
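For illustration, a minimal sketch of both weighting modes (the label, logit, and weight values below are made up for the example):

```python
import tensorflow as tf

labels = tf.constant([0, 2, 1])                 # class indices, shape [batch_size]
logits = tf.constant([[2.0, 0.5, 0.3],
                      [0.1, 0.2, 3.0],
                      [0.4, 1.5, 0.2]])         # shape [batch_size, num_classes]

# Scalar weight: the reduced loss is simply scaled by 2.0.
loss_scaled = tf.compat.v1.losses.sparse_softmax_cross_entropy(
    labels, logits, weights=2.0)

# Per-sample weights of shape [batch_size]: the third sample contributes
# nothing, and the default SUM_BY_NONZERO_WEIGHTS reduction divides the
# weighted sum by the number of nonzero weights (here, 2).
loss_weighted = tf.compat.v1.losses.sparse_softmax_cross_entropy(
    labels, logits, weights=tf.constant([1.0, 1.0, 0.0]))
```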
**Args**

| Argument | Description |
| --- | --- |
| `labels` | `Tensor` of shape `[d_0, d_1, ..., d_{r-1}]` (where `r` is the rank of `labels` and of the result) and dtype `int32` or `int64`. Each entry in `labels` must be an index in `[0, num_classes)`. Other values will raise an exception when this op is run on CPU, and return `NaN` for the corresponding loss and gradient rows on GPU. |
| `logits` | Unscaled log probabilities of shape `[d_0, d_1, ..., d_{r-1}, num_classes]` and dtype `float16`, `float32` or `float64`. |
| `weights` | Coefficients for the loss. This must be scalar or broadcastable to `labels` (i.e. same rank and each dimension either `1` or the same). |
| `scope` | The scope for the operations performed in computing the loss. |
| `loss_collection` | Collection to which the loss will be added. |
| `reduction` | Type of reduction to apply to loss. |

**Returns**

Weighted loss `Tensor` of the same type as `logits`. If `reduction` is `NONE`, this has the same shape as `labels`; otherwise, it is a scalar.

**Raises**

| Exception | Condition |
| --- | --- |
| `ValueError` | If the shapes of `logits`, `labels`, and `weights` are incompatible, or if any of them are `None`. |
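As a sketch of the return-shape behavior, requesting `Reduction.NONE` yields per-sample losses instead of a scalar (the tensor values below are illustrative):

```python
import tensorflow as tf

labels = tf.constant([0, 2, 1])
logits = tf.constant([[2.0, 0.5, 0.3],
                      [0.1, 0.2, 3.0],
                      [0.4, 1.5, 0.2]])

# With Reduction.NONE the result has the same shape as `labels` ([3] here);
# with the default reduction it would be a scalar.
per_sample = tf.compat.v1.losses.sparse_softmax_cross_entropy(
    labels, logits, reduction=tf.compat.v1.losses.Reduction.NONE)
```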
**Eager Compatibility**

The `loss_collection` argument is ignored when executing eagerly. Consider holding on to the return value or collecting losses via a `tf.keras.Model`.
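For instance, a minimal graph-mode sketch showing that the loss lands in the default `LOSSES` collection (the tensors are made up for the example):

```python
import tensorflow as tf

# Inside an explicit Graph context, ops are built in graph mode, so the
# loss_collection argument takes effect.
with tf.Graph().as_default():
    labels = tf.constant([0, 1])
    logits = tf.constant([[1.0, 0.0], [0.0, 1.0]])
    loss = tf.compat.v1.losses.sparse_softmax_cross_entropy(labels, logits)

    # The loss tensor was added to the default LOSSES collection automatically.
    losses = tf.compat.v1.get_collection(tf.compat.v1.GraphKeys.LOSSES)
    assert losses[0] is loss
```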
© 2020 The TensorFlow Authors. All rights reserved.
Licensed under the Creative Commons Attribution License 3.0.
Code samples licensed under the Apache 2.0 License.