
Module: tf.contrib.optimizer_v2

Distribution-aware version of Optimizer: these classes mirror the standard tf.train optimizers but are built on the updated OptimizerV2 base class.

Classes

class AdadeltaOptimizer: Optimizer that implements the Adadelta algorithm.

class AdagradOptimizer: Optimizer that implements the Adagrad algorithm.

class AdamOptimizer: Optimizer that implements the Adam algorithm.

class GradientDescentOptimizer: Optimizer that implements the gradient descent algorithm (see the usage sketch after this list).

class MomentumOptimizer: Optimizer that implements the Momentum algorithm.

class OptimizerV2: Updated base class for optimizers.

class RMSPropOptimizer: Optimizer that implements the RMSProp algorithm.
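
These classes keep the familiar tf.train optimizer interface, so minimize() can be called directly in graph mode. The following is a minimal sketch, not taken from this page: it assumes graph-mode TensorFlow 1.15, and the variable, loss, learning rate, and step count are illustrative choices.

    import tensorflow as tf  # TensorFlow 1.15, graph mode

    # Minimize (w - 3)^2 with the distribution-aware gradient descent optimizer.
    w = tf.Variable(0.0, name="w")
    loss = tf.square(w - 3.0)

    opt = tf.contrib.optimizer_v2.GradientDescentOptimizer(learning_rate=0.1)
    train_op = opt.minimize(loss)  # same minimize() call as the tf.train optimizers

    with tf.Session() as sess:
        sess.run(tf.global_variables_initializer())
        for _ in range(50):
            sess.run(train_op)
        print(sess.run(w))  # converges toward 3.0

Note that tf.contrib was removed in TensorFlow 2.x, so code written against this module generally migrates to the tf.keras.optimizers classes.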

© 2020 The TensorFlow Authors. All rights reserved.
Licensed under the Creative Commons Attribution License 3.0.
Code samples licensed under the Apache 2.0 License.
https://www.tensorflow.org/versions/r1.15/api_docs/python/tf/contrib/optimizer_v2