tf.raw_ops.RetrieveTPUEmbeddingFTRLParametersGradAccumDebug

Retrieve FTRL embedding parameters with debug support.

An op that retrieves optimization parameters from the TPU embedding tables to host memory. Must be preceded by a ConfigureTPUEmbeddingHost op that sets up the correct embedding table configuration. For example, this op is used to retrieve updated parameters before saving a checkpoint.

Args

num_shards: An int.
shard_id: An int.
table_id: An optional int. Defaults to -1.
table_name: An optional string. Defaults to "".
config: An optional string. Defaults to "".
name: A name for the operation (optional).
Returns

A tuple of Tensor objects (parameters, accumulators, linears, gradient_accumulators).

parameters: A Tensor of type float32.
accumulators: A Tensor of type float32.
linears: A Tensor of type float32.
gradient_accumulators: A Tensor of type float32.
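
The call below is a minimal sketch of how this raw op might be invoked, assuming it runs on a TPU host where the embedding tables have already been configured (for example by a ConfigureTPUEmbeddingHost op, as noted above). The shard and table values are illustrative, not a definitive setup.

```python
import tensorflow as tf

# Minimal sketch: fetch the FTRL optimization state (plus the debug
# gradient accumulators) for one embedding table. Assumes the TPU
# embedding configuration has already been set up on this host; the
# shard and table values below are illustrative only.
parameters, accumulators, linears, gradient_accumulators = (
    tf.raw_ops.RetrieveTPUEmbeddingFTRLParametersGradAccumDebug(
        num_shards=1,  # total number of shards the table is split across
        shard_id=0,    # which shard this host should retrieve
        table_id=0,    # identify the table by id (or pass table_name)
    )
)

# The retrieved tensors can then be copied into checkpointable
# variables, e.g. before saving a checkpoint.
```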

https://www.tensorflow.org/versions/r2.4/api_docs/python/tf/raw_ops/RetrieveTPUEmbeddingFTRLParametersGradAccumDebug