Warm-starts a model using the given settings.
```python
tf.train.warm_start(
    ckpt_to_initialize_from,
    vars_to_warm_start='.*',
    var_name_to_vocab_info=None,
    var_name_to_prev_var_name=None
)
```
If you are using a tf.estimator.Estimator, this will automatically be called during training.
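A minimal usage sketch, assuming TensorFlow 1.15 in graph mode; the checkpoint directory `/tmp/prev_model` and the variable names are hypothetical and would come from an earlier training run. Roughly, `warm_start` overrides the initial values of the matched variables, so running the usual variable initializer afterwards picks up the checkpointed values.

```python
import tensorflow as tf

# Build the new model's variables. Names must match those stored in the
# checkpoint for them to be warm-started.
dense_kernel = tf.get_variable("dense/kernel", shape=[10, 4])
dense_bias = tf.get_variable("dense/bias", shape=[4])

# Warm-start every trainable variable whose name matches the regex,
# reading from a (hypothetical) earlier run's checkpoint directory.
tf.train.warm_start(
    ckpt_to_initialize_from="/tmp/prev_model",
    vars_to_warm_start="dense.*")

with tf.Session() as sess:
    # Variables matched above take their values from the checkpoint; any
    # remaining variables are initialized as usual.
    sess.run(tf.global_variables_initializer())
```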
Args | Description
---|---
ckpt_to_initialize_from | [Required] A string specifying the directory containing checkpoint file(s), or the path to a specific checkpoint, from which to warm-start the model parameters. |
vars_to_warm_start | [Optional] One of the following: a regular expression (string) capturing which variables to warm-start, considering only variables in the TRAINABLE_VARIABLES collection; a list of such regex strings, in which case variables are collected from the GLOBAL_VARIABLES collection; a list of Variable objects to warm-start; or None, in which case only the trainable variables listed in var_name_to_vocab_info are warm-started. Defaults to '.*', which warm-starts all trainable variables. |
var_name_to_vocab_info | [Optional] Dict of variable names (strings) to tf.estimator.VocabInfo. The keys should be names of "full" variables, not names of individual partitions. If a variable is not listed, it is assumed to have no (changes to) vocabulary. |
var_name_to_prev_var_name | [Optional] Dict of variable names (strings) to the name of the previously-trained variable in ckpt_to_initialize_from. If a variable is not listed, its name is assumed to be the same in the previous checkpoint and the current model. Note that this has no effect on the set of variables that is warm-started; it only controls name mapping (use vars_to_warm_start to control which variables are warm-started). See the sketch after this table for how both mappings are used. |
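The two mapping arguments can be combined, as in the sketch below. It assumes a hypothetical checkpoint in `/tmp/prev_model` whose output layer was saved under the name `logits/kernel` and whose embedding vocabulary has since grown from 50 to 100 entries; the vocabulary file names and variable names are illustrative.

```python
import tensorflow as tf

# The new model names its output layer "output/kernel"; the previous
# checkpoint stored the corresponding weights as "logits/kernel".
output_kernel = tf.get_variable("output/kernel", shape=[16, 4])

# Embedding table whose vocabulary grew between runs (50 -> 100 rows).
embeddings = tf.get_variable("embeddings/weights", shape=[100, 16])

# Describes how to remap rows of the old embedding into the new one.
vocab_info = tf.estimator.VocabInfo(
    new_vocab="new_vocab.txt",      # hypothetical vocabulary files
    new_vocab_size=100,
    num_oov_buckets=0,
    old_vocab="old_vocab.txt",
    old_vocab_size=50)

tf.train.warm_start(
    ckpt_to_initialize_from="/tmp/prev_model",
    vars_to_warm_start=".*",
    var_name_to_vocab_info={"embeddings/weights": vocab_info},
    var_name_to_prev_var_name={"output/kernel": "logits/kernel"})
```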
Raises | Description
---|---
ValueError | If var_name_to_prev_var_name or var_name_to_vocab_info contains entries for variable names that are not actually used. This provides a stronger check on the variable configuration than relying on users to examine the logs. |
© 2020 The TensorFlow Authors. All rights reserved.
Licensed under the Creative Commons Attribution License 3.0.
Code samples licensed under the Apache 2.0 License.
https://www.tensorflow.org/versions/r1.15/api_docs/python/tf/train/warm_start