Optimizer
Defined in tensorflow/python/keras/_impl/keras/optimizers.py.
Abstract optimizer base class.
Note: this is the parent class of all optimizers, not an actual optimizer that can be used for training models.
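Concrete optimizers subclass this class and override get_updates. As a minimal, hypothetical sketch (the class name PlainSGD and its lr handling are illustrative, not part of the API), a vanilla gradient-descent optimizer might look like:

from tensorflow.keras import backend as K
from tensorflow.keras.optimizers import Optimizer

class PlainSGD(Optimizer):
    """Hypothetical minimal optimizer: plain gradient descent."""

    def __init__(self, lr=0.01, **kwargs):
        # The base class handles the clipnorm/clipvalue keyword arguments.
        super(PlainSGD, self).__init__(**kwargs)
        with K.name_scope(self.__class__.__name__):
            self.lr = K.variable(lr, name='lr')

    def get_updates(self, loss, params):
        # get_gradients applies any clipnorm/clipvalue clipping.
        grads = self.get_gradients(loss, params)
        self.updates = [K.update_sub(p, self.lr * g)
                        for p, g in zip(params, grads)]
        return self.updates

    def get_config(self):
        config = {'lr': float(K.get_value(self.lr))}
        base_config = super(PlainSGD, self).get_config()
        return dict(list(base_config.items()) + list(config.items()))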
All Keras optimizers support the following keyword arguments:
clipnorm: float >= 0. Gradients will be clipped when their L2 norm exceeds this value.
clipvalue: float >= 0. Gradients will be clipped when their absolute value exceeds this value.
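For example, a minimal sketch of passing these arguments to a concrete optimizer (SGD and the lr value here are illustrative choices, not prescribed by this class):

from tensorflow.keras.optimizers import SGD

# All parameter gradients will be clipped to an L2 norm of at most 1.0.
sgd = SGD(lr=0.01, clipnorm=1.0)

# All parameter gradient elements will be clipped to the range [-0.5, 0.5].
sgd = SGD(lr=0.01, clipvalue=0.5)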
__init__(**kwargs)
Initialize self. See help(type(self)) for accurate signature.
from_config (classmethod)

from_config(
    cls,
    config
)

Creates an optimizer instance from a config dictionary (the inverse of get_config).
get_config()

Returns the optimizer configuration as a Python dictionary, suitable for re-creating the optimizer via from_config.
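Together, get_config and from_config let an optimizer's hyperparameters be serialized and restored. A sketch of the round trip (assuming the SGD optimizer from tf.keras):

from tensorflow.keras.optimizers import SGD

opt = SGD(lr=0.01, momentum=0.9, clipnorm=1.0)
config = opt.get_config()           # plain Python dict of hyperparameters
restored = SGD.from_config(config)  # equivalent optimizer, freshly built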
get_gradients(
    loss,
    params
)
Returns gradients of loss with respect to params.
Arguments:
loss: Loss tensor.
params: List of variables.

Returns:
List of gradient tensors.

Raises:
ValueError: in case any gradient cannot be computed (e.g. if the gradient function is not implemented).

get_updates(
    loss,
    params
)

Returns the list of update ops that perform one optimization step on params to minimize loss.
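These two methods are mostly called by the training machinery, but they can be invoked directly. A graph-mode sketch (TF 1.x style; the variable and loss here are hypothetical):

import tensorflow as tf

w = tf.keras.backend.variable([1.0, 2.0], name='w')
loss = tf.reduce_sum(tf.square(w))

opt = tf.keras.optimizers.SGD(lr=0.1)
grads = opt.get_gradients(loss, [w])   # list with one gradient tensor
updates = opt.get_updates(loss, [w])   # ops that apply one training step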
get_weights()
Returns the current value of the weights of the optimizer.
Returns:
A list of numpy arrays.
set_weights(weights)
Sets the weights of the optimizer, from Numpy arrays.
Should only be called after computing the gradients (otherwise the optimizer has no weights).
Arguments:
weights: a list of Numpy arrays. The number of arrays and their shapes must match the number and shapes of the weights of the optimizer (i.e. it should match the output of get_weights).

Raises:
ValueError: in case of incompatible weight shapes.
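A sketch of snapshotting and restoring optimizer state (the model, data, and shapes are hypothetical; note the training step before get_weights, since the optimizer creates its weights lazily):

import numpy as np
import tensorflow as tf

model = tf.keras.Sequential([tf.keras.layers.Dense(1, input_shape=(4,))])
opt = tf.keras.optimizers.SGD(lr=0.01, momentum=0.9)
model.compile(optimizer=opt, loss='mse')

x = np.random.rand(8, 4).astype('float32')
y = np.random.rand(8, 1).astype('float32')
model.train_on_batch(x, y)      # optimizer weights now exist

state = opt.get_weights()       # snapshot: list of numpy arrays
model.train_on_batch(x, y)      # train further...
opt.set_weights(state)          # ...then roll the optimizer state back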
© 2018 The TensorFlow Authors. All rights reserved.
Licensed under the Creative Commons Attribution License 3.0.
Code samples licensed under the Apache 2.0 License.
https://www.tensorflow.org/api_docs/python/tf/keras/optimizers/Optimizer