tf.contrib.opt.clip_gradients_by_global_norm

tf.contrib.opt.clip_gradients_by_global_norm(
    gradients_variables,
    clip_norm=20.0
)

Defined in tensorflow/contrib/opt/python/training/multitask_optimizer_wrapper.py.

Clips gradients of a multitask loss by their global norm.

Ignores all-zero tensors when computing the global norm.

Args:

  • gradients_variables: a list of pairs (gradient, variable).
  • clip_norm: a float Tensor, the global norm to clip on. Default is 20.0.

Returns:

  • list: A list of pairs of the same type as gradients_variables, with the gradients clipped.
  • fixed_global_norm: A 0-D (scalar) Tensor representing the global norm.
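As an illustration, here is a minimal usage sketch showing how the function slots into a compute-gradients/apply-gradients training step in TensorFlow 1.x (where tf.contrib is available). The toy variable, loss, optimizer, and learning rate below are illustrative assumptions, not part of the documented API.

import tensorflow as tf

# Toy variable and loss standing in for a multitask loss.
w = tf.Variable([1.0, 2.0], name="w")
loss = tf.reduce_sum(tf.square(w))

optimizer = tf.train.GradientDescentOptimizer(learning_rate=0.1)
grads_and_vars = optimizer.compute_gradients(loss)

# Clip the gradients by their global norm; all-zero gradient tensors
# are ignored when the norm is computed.
clipped, global_norm = tf.contrib.opt.clip_gradients_by_global_norm(
    grads_and_vars, clip_norm=20.0)

train_op = optimizer.apply_gradients(clipped)

The returned global_norm tensor can be logged (for example via tf.summary.scalar) to monitor how often clipping actually takes effect during training.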

© 2018 The TensorFlow Authors. All rights reserved.
Licensed under the Creative Commons Attribution License 3.0.
Code samples licensed under the Apache 2.0 License.
https://www.tensorflow.org/api_docs/python/tf/contrib/opt/clip_gradients_by_global_norm