tf.losses.compute_weighted_loss( losses, weights=1.0, scope=None, loss_collection=tf.GraphKeys.LOSSES, reduction=Reduction.SUM_BY_NONZERO_WEIGHTS )
Computes the weighted loss.
Args:

losses: Tensor of shape [batch_size, d1, ... dN].
weights: Optional Tensor whose rank is either 0, or the same rank as losses, and must be broadcastable to losses (i.e., all dimensions must be either 1, or the same as the corresponding losses dimension).
scope: the scope for the operations performed in computing the loss.
loss_collection: the loss will be added to these collections.
reduction: Type of reduction to apply to loss.
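The broadcasting requirement on weights can be illustrated with NumPy (an assumption for illustration; TensorFlow follows the same broadcasting rules): each dimension of weights must be 1 or match the corresponding losses dimension, and a rank-0 (scalar) weight always broadcasts.

```python
import numpy as np

# losses of shape [batch_size=4, d1=3]
losses = np.ones((4, 3))

# Rank matches losses; the trailing dimension of size 1 broadcasts.
per_example = np.ones((4, 1))

# Rank 0: a scalar weight broadcasts to every element.
scalar = 0.5

print((losses * per_example).shape)  # (4, 3)
print((losses * scalar).shape)       # (4, 3)
```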
Returns:

Weighted loss Tensor of the same type as losses. If reduction is NONE, this has the same shape as losses; otherwise, it is scalar.
Raises:

ValueError: If weights is None or the shape is not compatible with losses, or if the number of dimensions (rank) of either losses or weights is missing.
Note: When calculating the gradient of a weighted loss, contributions from both losses and weights are considered. If your weights depend on some model parameters but you do not want this to affect the loss gradient, you need to apply tf.stop_gradient to weights before passing them to compute_weighted_loss.
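The effect this note warns about can be seen with scalar functions and numerical differentiation (a sketch, not the TF API: the functions and the numerical-gradient helper are illustrative): when the weight depends on the same parameter as the loss, the product rule adds a gradient term from the weight itself, which is exactly what treating the weight as a constant (the role of tf.stop_gradient) removes.

```python
def numerical_grad(f, theta, eps=1e-6):
    # Central-difference approximation of df/dtheta.
    return (f(theta + eps) - f(theta - eps)) / (2 * eps)

def loss(theta):
    return theta ** 2      # d(loss)/d(theta) = 2*theta

def weight(theta):
    return 3.0 * theta     # weight depends on the same parameter

theta = 2.0

# Weight participates in the gradient: d(w*l)/dtheta = w'*l + w*l'.
full = numerical_grad(lambda t: weight(t) * loss(t), theta)

# Weight "stopped": hold w constant at its current value.
w_const = weight(theta)
stopped = numerical_grad(lambda t: w_const * loss(t), theta)

print(full)     # ≈ 36.0  (3*4 + 6*4)
print(stopped)  # ≈ 24.0  (6*4)
```

The 12-unit difference is the spurious gradient contributed through the weight, which stop-gradient semantics suppress.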
© 2018 The TensorFlow Authors. All rights reserved.
Licensed under the Creative Commons Attribution License 3.0.
Code samples licensed under the Apache 2.0 License.