# tf.contrib.gan.losses.wargs.combine_adversarial_loss

```
tf.contrib.gan.losses.wargs.combine_adversarial_loss(
    main_loss,
    adversarial_loss,
    weight_factor=None,
    gradient_ratio=None,
    gradient_ratio_epsilon=1e-6,
    variables=None,
    scalar_summaries=True,
    gradient_summaries=True,
    scope=None
)
```

Utility to combine main and adversarial losses.

This utility combines the main and adversarial losses in one of two ways:

1. Fixed coefficient on the adversarial loss. Use `weight_factor` in this case.
2. Fixed ratio of gradient magnitudes. Use `gradient_ratio` in this case. This is often used to make sure both losses affect weights roughly equally, as in https://arxiv.org/pdf/1705.05823.
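Both modes reduce to adding a scaled adversarial loss to the main loss. A minimal pure-Python sketch of the fixed-coefficient mode (the function name here is illustrative, not part of the API; the real utility operates on TF tensors):

```python
def combine_with_weight_factor(main_loss, adversarial_loss, weight_factor):
    # Fixed-coefficient mode: the adversarial loss is scaled by a constant
    # factor and added to the main loss.
    return main_loss + weight_factor * adversarial_loss

print(combine_with_weight_factor(2.0, 0.5, 4.0))  # 2.0 + 4.0 * 0.5 = 4.0
```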

One can optionally also visualize the scalar and gradient behavior of the losses.

#### Args:

• `main_loss`: A floating scalar Tensor indicating the main loss.
• `adversarial_loss`: A floating scalar Tensor indicating the adversarial loss.
• `weight_factor`: If not `None`, the coefficient by which to multiply the adversarial loss. Exactly one of this and `gradient_ratio` must be non-None.
• `gradient_ratio`: If not `None`, the ratio of the magnitude of the gradients. Specifically, `gradient_ratio = grad_mag(main_loss) / grad_mag(adversarial_loss)`. Exactly one of this and `weight_factor` must be non-None.
• `gradient_ratio_epsilon`: An epsilon to add to the adversarial loss coefficient denominator, to avoid division-by-zero.
• `variables`: List of variables to calculate gradients with respect to. If not present, defaults to all trainable variables.
• `scalar_summaries`: Create scalar summaries of losses.
• `gradient_summaries`: Create gradient summaries of losses.
• `scope`: Optional name scope.

#### Returns:

A floating scalar Tensor indicating the desired combined loss.
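In the `gradient_ratio` mode, the adversarial loss is rescaled so that the gradient magnitudes stand in the requested ratio. A hedged sketch of that rescaling coefficient, assuming the combined loss has the form `main_loss + coeff * adversarial_loss` (the function and argument names are illustrative; the library computes the gradient norms internally via `tf.gradients` over `variables`):

```python
def adversarial_coefficient(main_grad_norm, adv_grad_norm,
                            gradient_ratio, epsilon=1e-6):
    # Choose coeff so that grad_mag(main_loss) / grad_mag(coeff * adv_loss)
    # equals the requested gradient_ratio; epsilon (gradient_ratio_epsilon)
    # guards the denominator against division by zero.
    return main_grad_norm / (gradient_ratio * adv_grad_norm + epsilon)

coeff = adversarial_coefficient(10.0, 2.0, gradient_ratio=1.0)
# The combined loss would then be main_loss + coeff * adversarial_loss.
```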

#### Raises:

• `ValueError`: Malformed input.