Checkpoint
Inherits From: Checkpointable
Defined in tensorflow/contrib/eager/python/checkpointable_utils.py.
A utility class which groups Checkpointable objects.

Accepts arbitrary keyword arguments to its constructor and saves those values with a checkpoint. Maintains a save_counter for numbering checkpoints.
Example usage:
import tensorflow as tf
import tensorflow.contrib.eager as tfe
import os

checkpoint_directory = "/tmp/training_checkpoints"
checkpoint_prefix = os.path.join(checkpoint_directory, "ckpt")

# Group the objects to save/restore, then restore the latest checkpoint if any.
root = tfe.Checkpoint(optimizer=optimizer, model=model)
root.restore(tf.train.latest_checkpoint(checkpoint_directory))
for _ in range(num_training_steps):
  optimizer.minimize( ... )
root.save(file_prefix=checkpoint_prefix)
For more manual control over saving, use tfe.CheckpointableSaver directly.
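A minimal sketch of that manual path, assuming tfe.CheckpointableSaver is constructed with the root checkpointable object and that its save/restore calls mirror the methods documented below:

import os

import tensorflow as tf
import tensorflow.contrib.eager as tfe

tfe.enable_eager_execution()

checkpoint_directory = "/tmp/training_checkpoints"
checkpoint_prefix = os.path.join(checkpoint_directory, "ckpt")

# Root object whose tracked attributes (here one variable) are written out.
root = tfe.Checkpoint(step=tfe.Variable(0))

# Assumption: the saver wraps a root checkpointable object; save writes a
# checkpoint under the given prefix and returns its path.
saver = tfe.CheckpointableSaver(root)
save_path = saver.save(checkpoint_prefix)
saver.restore(save_path)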
Attributes:
save_counter: Incremented when save() is called. Used to number checkpoints.

Properties

save_counter
An integer variable which starts at zero and is incremented on save. Used to number checkpoints.
Returns: The save counter variable.
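For example (a sketch, assuming eager execution and the checkpoint_prefix defined in the example above, so the counter's value can be read with .numpy()):

root = tfe.Checkpoint(v=tfe.Variable(1.0))
root.save(file_prefix=checkpoint_prefix)
# One save so far, so the counter reads 1.
print(int(root.save_counter.numpy()))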
__init__
__init__(**kwargs)
Group objects into a training checkpoint.
Args:
**kwargs: Keyword arguments are set as attributes of this object, and are saved with the checkpoint. Attribute values must derive from CheckpointableBase.

Raises:
ValueError: If objects in kwargs are not Checkpointable.
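For instance (a sketch; wrapping a plain Python value in a variable is one way, assumed here, to make it checkpointable):

try:
  tfe.Checkpoint(learning_rate=0.1)  # A plain float is not Checkpointable.
except ValueError:
  pass

root = tfe.Checkpoint(learning_rate=tfe.Variable(0.1))  # Tracked and saved.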
__setattr__
__setattr__( name, value )
Support self.foo = checkpointable syntax.
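A sketch of assignment after construction, assuming variables are themselves checkpointable and reusing the checkpoint_prefix from the example above:

root = tfe.Checkpoint()
# The assigned variable is tracked and included in subsequent saves.
root.bias = tfe.Variable(0.0)
root.save(file_prefix=checkpoint_prefix)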
restore
restore(save_path)
Restore a checkpoint. Wraps tfe.CheckpointableSaver.restore.
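For example, restoring the most recent checkpoint only when one exists (a sketch; tf.train.latest_checkpoint returns None for a directory with no checkpoints):

latest_path = tf.train.latest_checkpoint(checkpoint_directory)
if latest_path is not None:
  root.restore(latest_path)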
save
save( file_prefix, session=None )
Save a checkpoint. Wraps tfe.CheckpointableSaver.save.
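When building graphs rather than executing eagerly, the session argument lets the save read variable values from that session (a sketch; the return value is assumed to be the path of the checkpoint that was written):

with tf.Session() as session:
  session.run(tf.global_variables_initializer())
  save_path = root.save(file_prefix=checkpoint_prefix, session=session)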
© 2018 The TensorFlow Authors. All rights reserved.
Licensed under the Creative Commons Attribution License 3.0.
Code samples licensed under the Apache 2.0 License.
https://www.tensorflow.org/api_docs/python/tf/contrib/eager/Checkpoint