tf.losses.compute_weighted_loss(
    losses,
    weights=1.0,
    scope=None,
    loss_collection=tf.GraphKeys.LOSSES,
    reduction=Reduction.SUM_BY_NONZERO_WEIGHTS
)
Defined in tensorflow/python/ops/losses/losses_impl.py.
Computes the weighted loss.
Args:
  losses: Tensor of shape [batch_size, d1, ... dN].
  weights: Optional Tensor whose rank is either 0, or the same rank as losses, and must be broadcastable to losses (i.e., all dimensions must be either 1, or the same as the corresponding losses dimension).
  scope: the scope for the operations performed in computing the loss.
  loss_collection: the loss will be added to these collections.
  reduction: Type of reduction to apply to loss.

Returns:
  Weighted loss Tensor of the same type as losses. If reduction is NONE, this has the same shape as losses; otherwise, it is scalar.
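
For example, the following minimal sketch (assuming a TensorFlow 1.x session; in TensorFlow 2.x the same function is available as tf.compat.v1.losses.compute_weighted_loss) applies a scalar weight, a per-example weight vector, and reduction=Reduction.NONE:

import tensorflow as tf

# Per-element losses for a batch of 4 examples.
losses = tf.constant([0.5, 2.0, 0.0, 1.5])

# Rank-0 weight: every element is scaled by the same factor, then reduced
# with the default SUM_BY_NONZERO_WEIGHTS reduction to a scalar.
scalar_weighted = tf.losses.compute_weighted_loss(losses, weights=2.0)

# Same-rank weights: the third example is masked out by a zero weight and is
# also excluded from the nonzero-weight count used by the default reduction.
weights = tf.constant([1.0, 1.0, 0.0, 1.0])
masked = tf.losses.compute_weighted_loss(losses, weights=weights)

# With reduction=NONE the result keeps the [4] shape of losses.
unreduced = tf.losses.compute_weighted_loss(
    losses, weights=weights, reduction=tf.losses.Reduction.NONE)

with tf.Session() as sess:
    print(sess.run([scalar_weighted, masked, unreduced]))
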
Raises:
  ValueError: If weights is None or the shape is not compatible with losses, or if the number of dimensions (rank) of either losses or weights is missing.

Note: When calculating the gradient of a weighted loss, contributions from both losses and weights are considered. If your weights depend on some model parameters but you do not want this to affect the loss gradient, you need to apply tf.stop_gradient to weights before passing them to compute_weighted_loss.
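
The sketch below illustrates the note above; it assumes a TensorFlow 1.x graph, and confidence is a hypothetical trainable parameter, not part of this API:

import tensorflow as tf

# Hypothetical trainable parameter from which the weights are derived.
confidence = tf.Variable([0.9, 0.1, 0.5])
losses = tf.constant([1.0, 2.0, 3.0])

# Without tf.stop_gradient, the gradient of the weighted loss would also flow
# into confidence through the weights; stopping it keeps confidence out of
# the loss gradient, as described in the note.
weights = tf.stop_gradient(confidence)
loss = tf.losses.compute_weighted_loss(losses, weights=weights)

# No differentiable path from loss to confidence remains.
print(tf.gradients(loss, confidence))  # [None]
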
© 2018 The TensorFlow Authors. All rights reserved.
Licensed under the Creative Commons Attribution License 3.0.
Code samples licensed under the Apache 2.0 License.
https://www.tensorflow.org/api_docs/python/tf/losses/compute_weighted_loss