Partitioner that specifies a fixed number of shards along the given axis.
tf.compat.v1.fixed_size_partitioner(
    num_shards, axis=0
)
Migrate to TF2
This API is deprecated in TF2. In TF2, a partitioner is no longer passed as part of the variable declaration via tf.Variable; instead, parameter server training handles the partitioning of variables. The TF2 partitioner class corresponding to fixed_size_partitioner is tf.distribute.experimental.partitioners.FixedShardsPartitioner.
Check the migration guide on the differences in treatment of variables and losses between TF1 and TF2.
Before:
x = tf.compat.v1.get_variable(
    "x",
    shape=(2,),
    partitioner=tf.compat.v1.fixed_size_partitioner(2)
)
After:
partitioner = (
    tf.distribute.experimental.partitioners.FixedShardsPartitioner(
        num_shards=2)
)
strategy = tf.distribute.experimental.ParameterServerStrategy(
               cluster_resolver=cluster_resolver,
               variable_partitioner=partitioner)
with strategy.scope():
  x = tf.Variable([1.0, 2.0])
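To illustrate how a fixed-shards partitioner splits a variable, here is a minimal pure-Python sketch of the shard-size arithmetic. `shard_shapes` is a hypothetical helper, not part of the TensorFlow API; it assumes the common convention that earlier shards absorb any remainder when the dimension does not divide evenly.

```python
def shard_shapes(shape, num_shards, axis=0):
    # Hypothetical helper: compute per-shard shapes when splitting `shape`
    # into `num_shards` equal-as-possible pieces along `axis`.
    dim = shape[axis]
    base = dim // num_shards
    extra = dim % num_shards  # the first `extra` shards get one more element
    shapes = []
    for i in range(num_shards):
        size = base + (1 if i < extra else 0)
        s = list(shape)
        s[axis] = size
        shapes.append(tuple(s))
    return shapes

print(shard_shapes((10, 4), 3))  # [(4, 4), (3, 4), (3, 4)]
```

Under this sketch, the `x = tf.Variable([1.0, 2.0])` above, with `num_shards=2`, would be stored as two shards of shape `(1,)` each.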
| Args | |
|---|---|
| `num_shards` | `int`, number of shards to partition the variable into. |
| `axis` | `int`, axis to partition along. |
| Returns | |
|---|---|
| A partition function usable as the `partitioner` argument to `variable_scope` and `get_variable`. |
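The returned partition function can be sketched in pure Python. `fixed_size_partitioner_sketch` is a hypothetical stand-in for the real API, assuming the documented contract: given a variable shape, it returns a list with `num_shards` at position `axis` and 1 everywhere else.

```python
def fixed_size_partitioner_sketch(num_shards, axis=0):
    # Hypothetical sketch of the partition function's contract:
    # map a variable shape to a per-axis partition count.
    def _partitioner(shape, dtype=None):
        partitions = [1] * len(shape)
        partitions[axis] = num_shards
        return partitions
    return _partitioner

p = fixed_size_partitioner_sketch(2)
print(p((10, 4)))  # [2, 1]: 2 shards along axis 0, axis 1 unsplit
```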
© 2022 The TensorFlow Authors. All rights reserved.
Licensed under the Creative Commons Attribution License 4.0.
Code samples licensed under the Apache 2.0 License.
https://www.tensorflow.org/versions/r2.9/api_docs/python/tf/compat/v1/fixed_size_partitioner