Module: tf.keras.optimizers

Built-in optimizer classes.

For more examples, see the base class tf.keras.optimizers.Optimizer.
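
Any optimizer in this module can be passed to Model.compile, either as an instance or by its string identifier. A minimal sketch (the toy model here is illustrative):

    import tensorflow as tf

    # Illustrative toy model; any Keras model works the same way.
    model = tf.keras.Sequential([tf.keras.layers.Dense(1, input_shape=(4,))])

    # Passing an optimizer instance allows configuring hyperparameters
    # such as the learning rate; the string form ("adam") uses defaults.
    model.compile(optimizer=tf.keras.optimizers.Adam(learning_rate=1e-3),
                  loss="mse")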

Modules

schedules module: Public API for tf.keras.optimizers.schedules namespace.
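
A LearningRateSchedule from the schedules module can be passed to any optimizer in place of a fixed learning rate. A minimal sketch using ExponentialDecay (the decay values are illustrative):

    import tensorflow as tf

    # Decay the learning rate by a factor of 0.96 every 1000 steps.
    lr_schedule = tf.keras.optimizers.schedules.ExponentialDecay(
        initial_learning_rate=0.1,
        decay_steps=1000,
        decay_rate=0.96)

    optimizer = tf.keras.optimizers.SGD(learning_rate=lr_schedule)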

Classes

class Adadelta: Optimizer that implements the Adadelta algorithm.

class Adagrad: Optimizer that implements the Adagrad algorithm.

class Adam: Optimizer that implements the Adam algorithm.

class Adamax: Optimizer that implements the Adamax algorithm.

class Ftrl: Optimizer that implements the FTRL algorithm.

class Nadam: Optimizer that implements the NAdam algorithm.

class Optimizer: Base class for Keras optimizers.

class RMSprop: Optimizer that implements the RMSprop algorithm.

class SGD: Gradient descent (with momentum) optimizer.
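
Outside of Model.fit, any of these optimizers can be driven manually: compute gradients with tf.GradientTape and apply them with apply_gradients. A minimal sketch with a toy scalar loss:

    import tensorflow as tf

    var = tf.Variable(2.0)
    optimizer = tf.keras.optimizers.SGD(learning_rate=0.1, momentum=0.9)

    with tf.GradientTape() as tape:
        loss = var ** 2  # toy loss with its minimum at var == 0

    # Apply one update step: var moves toward the minimum.
    grads = tape.gradient(loss, [var])
    optimizer.apply_gradients(zip(grads, [var]))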

Functions

deserialize(...): Inverse of the serialize function.

get(...): Retrieves a Keras Optimizer instance.

serialize(...): Inverse of the deserialize function.
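
A sketch of how these helpers fit together: get resolves a string identifier to an optimizer instance, and serialize/deserialize round-trip an optimizer's configuration:

    import tensorflow as tf

    opt = tf.keras.optimizers.get("adam")        # Adam instance with defaults

    config = tf.keras.optimizers.serialize(opt)  # JSON-compatible dict
    restored = tf.keras.optimizers.deserialize(config)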
