Module: tf.contrib.opt
A module containing optimization routines.
Defined in tensorflow/contrib/opt/__init__.py.
class AddSignOptimizer: Optimizer that implements the AddSign update.
class DropStaleGradientOptimizer: Wrapper optimizer that checks for and drops stale gradients.
class ElasticAverageCustomGetter: Custom getter used with ElasticAverageOptimizer to manage local and global (center) variable copies.
class ElasticAverageOptimizer: Wrapper optimizer that implements the Elastic Average SGD algorithm.
class ExternalOptimizerInterface: Base class for interfaces with external optimization algorithms.
class LazyAdamOptimizer: Variant of the Adam optimizer that handles sparse updates more efficiently.
class ModelAverageCustomGetter: Custom getter used with ModelAverageOptimizer to manage local and global (center) variable copies.
class ModelAverageOptimizer: Wrapper optimizer that implements the Model Average algorithm.
class MovingAverageOptimizer: Optimizer that computes a moving average of the variables.
class MultitaskOptimizerWrapper: Optimizer wrapper making all-zero gradients harmless.
class NadamOptimizer: Optimizer that implements the Nadam algorithm.
class PowerSignOptimizer: Optimizer that implements the PowerSign update.
class ScipyOptimizerInterface: Wrapper allowing scipy.optimize.minimize to operate on a tf.Session.
class VariableClippingOptimizer: Wrapper optimizer that clips the norm of specified variables after update.
clip_gradients_by_global_norm(...): Clips gradients of a multitask loss by their global norm.
Other members: __cached__, __loader__, __spec__
© 2018 The TensorFlow Authors. All rights reserved.
Licensed under the Creative Commons Attribution License 3.0.
Code samples licensed under the Apache 2.0 License.
https://www.tensorflow.org/api_docs/python/tf/contrib/opt