Defined in tensorflow/contrib/optimizer_v2/optimizer_v2_symbols.py.
Distribution-aware versions of the standard Optimizer classes.
Classes

class AdadeltaOptimizer: Optimizer that implements the Adadelta algorithm.
class AdagradOptimizer: Optimizer that implements the Adagrad algorithm.
class AdamOptimizer: Optimizer that implements the Adam algorithm.
class GradientDescentOptimizer: Optimizer that implements the gradient descent algorithm.
class MomentumOptimizer: Optimizer that implements the Momentum algorithm.
class OptimizerV2: Updated base class for optimizers.
class RMSPropOptimizer: Optimizer that implements the RMSProp algorithm.
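These classes mirror the familiar tf.train optimizer API. As a minimal sketch of typical usage, assuming a TensorFlow 1.x environment where tf.contrib is available (the variable, loss, and learning rate below are illustrative, not from this page):

    import tensorflow as tf

    # A trivial quadratic loss over a single variable, minimized at w = 3.
    w = tf.Variable(5.0)
    loss = tf.square(w - 3.0)

    # OptimizerV2 subclasses expose the same minimize() entry point as
    # their tf.train counterparts; minimize() returns an op that applies
    # one gradient descent update to `w`.
    opt = tf.contrib.optimizer_v2.GradientDescentOptimizer(learning_rate=0.1)
    train_op = opt.minimize(loss)

    with tf.Session() as sess:
        sess.run(tf.global_variables_initializer())
        for _ in range(50):
            sess.run(train_op)
        print(sess.run(w))  # converges toward 3.0

Any of the other classes listed above (AdamOptimizer, RMSPropOptimizer, etc.) can be substituted for GradientDescentOptimizer with their own hyperparameters.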
Other Members

__cached__
__loader__
__spec__
https://www.tensorflow.org/api_docs/python/tf/contrib/optimizer_v2