Abstract optimizer base class.
Note: this is the parent class of all optimizers, not an actual optimizer that can be used for training models.
All Keras optimizers support the following keyword arguments:
clipnorm: float >= 0. Gradients will be clipped when their L2 norm exceeds this value.
clipvalue: float >= 0. Gradients will be clipped when their absolute value exceeds this value.
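The clipping behavior these two arguments describe can be illustrated in plain NumPy (a minimal sketch of the semantics, not the Keras implementation; the helper names here are hypothetical):

```python
import numpy as np

def clip_by_norm(grads, clipnorm):
    # Rescale each gradient so its L2 norm does not exceed clipnorm.
    clipped = []
    for g in grads:
        norm = np.linalg.norm(g)
        if norm > clipnorm:
            g = g * (clipnorm / norm)
        clipped.append(g)
    return clipped

def clip_by_value(grads, clipvalue):
    # Clamp each gradient element to the range [-clipvalue, clipvalue].
    return [np.clip(g, -clipvalue, clipvalue) for g in grads]

grads = [np.array([3.0, 4.0]), np.array([0.5, -6.0])]
by_norm = clip_by_norm(grads, clipnorm=1.0)     # each norm is now <= 1.0
by_value = clip_by_value(grads, clipvalue=2.0)  # each element in [-2, 2]
```

Note that clipnorm rescales the whole gradient (preserving its direction), while clipvalue clamps elements independently (changing the direction when any element is clipped).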
__init__()
Initialize self. See help(type(self)) for accurate signature.
@classmethod
from_config(cls, config)
Creates an optimizer from its config dictionary.
get_gradients(loss, params)
Returns gradients of loss with respect to params.

Arguments:
loss: Loss tensor.
params: List of variables.

Returns:
List of gradient tensors.

Raises:
ValueError: In case any gradient cannot be computed (e.g. if the gradient function is not implemented).
get_updates(loss, params)
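A subclass implements get_updates to turn gradients into parameter updates. The contract can be sketched in pure Python with NumPy arrays standing in for tensors (the class and function names here are hypothetical, and Keras returns symbolic update ops rather than new values):

```python
import numpy as np

class ToySGD:
    """Minimal sketch of the optimizer contract (not the Keras base class)."""

    def __init__(self, lr=0.1):
        self.lr = lr

    def get_gradients(self, grad_fn, params):
        # In Keras, gradients are derived from the loss tensor;
        # here a gradient function stands in for that machinery.
        return [grad_fn(p) for p in params]

    def get_updates(self, grad_fn, params):
        # Return updated parameter values, one per parameter.
        grads = self.get_gradients(grad_fn, params)
        return [p - self.lr * g for p, g in zip(params, grads)]

# Minimize f(x) = x^2, whose gradient is 2x.
opt = ToySGD(lr=0.1)
params = [np.array([4.0])]
for _ in range(50):
    params = opt.get_updates(lambda p: 2.0 * p, params)
# params[0] is now close to the minimum at 0.
```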
get_weights()
Returns the current value of the weights of the optimizer.

Returns:
A list of NumPy arrays.
set_weights(weights)
Sets the weights of the optimizer, from NumPy arrays.

Should only be called after computing the gradients (otherwise the optimizer has no weights).

Arguments:
weights: a list of NumPy arrays. The number of arrays and their shapes must match the number and dimensions of the weights of the optimizer (i.e. it should match the output of get_weights).

Raises:
ValueError: in case of incompatible weight shapes.
© 2018 The TensorFlow Authors. All rights reserved.
Licensed under the Creative Commons Attribution License 3.0.
Code samples licensed under the Apache 2.0 License.