Built-in optimizer classes.
Modules

schedules module: Public API for the tf.keras.optimizers.schedules namespace.
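For illustration, a minimal sketch of pairing a learning-rate schedule from this module with one of the optimizer classes listed below; the ExponentialDecay values here are arbitrary placeholders:

```python
import tensorflow as tf

# Decay the learning rate by a factor of 0.96 every 10,000 steps
# (all values here are arbitrary placeholders).
lr_schedule = tf.keras.optimizers.schedules.ExponentialDecay(
    initial_learning_rate=0.1,
    decay_steps=10000,
    decay_rate=0.96,
    staircase=True)

# A schedule can be passed anywhere a fixed learning rate is accepted.
optimizer = tf.keras.optimizers.SGD(learning_rate=lr_schedule)
```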
Classes

class Adadelta: Optimizer that implements the Adadelta algorithm.
class Adagrad: Optimizer that implements the Adagrad algorithm.
class Adam: Optimizer that implements the Adam algorithm.
class Adamax: Optimizer that implements the Adamax algorithm.
class Ftrl: Optimizer that implements the FTRL algorithm.
class Nadam: Optimizer that implements the NAdam algorithm.
class Optimizer: Updated base class for optimizers.
class RMSprop: Optimizer that implements the RMSprop algorithm.
class SGD: Stochastic gradient descent and momentum optimizer.
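A minimal sketch of how these classes are typically used: an instance is passed to model.compile (the model architecture and loss below are arbitrary placeholders):

```python
import tensorflow as tf

# Build a toy model; the layer shape and loss are placeholders.
model = tf.keras.Sequential([
    tf.keras.layers.Dense(1, input_shape=(4,)),
])

# Optimizers can be passed as an instance (for custom hyperparameters)
# or by string name for the defaults, e.g. optimizer='adam'.
model.compile(optimizer=tf.keras.optimizers.Adam(learning_rate=0.001),
              loss='mse')
```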
Functions

deserialize(...): Inverse of the serialize function.
get(...): Retrieves a Keras Optimizer instance.
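A short sketch of both functions: get builds an optimizer from a string identifier, and deserialize inverts the serialize function referenced above:

```python
import tensorflow as tf

# get() accepts a string identifier and returns a configured instance.
opt = tf.keras.optimizers.get('adam')

# deserialize() inverts serialize(): the config dict round-trips back
# to an equivalent optimizer instance.
config = tf.keras.optimizers.serialize(opt)
opt_again = tf.keras.optimizers.deserialize(config)
```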
© 2020 The TensorFlow Authors. All rights reserved.
Licensed under the Creative Commons Attribution License 3.0.
Code samples licensed under the Apache 2.0 License.
https://www.tensorflow.org/versions/r1.15/api_docs/python/tf/keras/optimizers