
TensorFlow 2.3

Module: tf.compat.v1.distribute.experimental

Experimental Distribution Strategy library.

Classes

class CentralStorageStrategy: A one-machine strategy that puts all variables on a single device.

class CollectiveCommunication: Communication choices for CollectiveOps.

class CollectiveHints: Hints for collective operations like AllReduce.

class MultiWorkerMirroredStrategy: A distribution strategy for synchronous training on multiple workers.

class ParameterServerStrategy: An asynchronous multi-worker parameter server tf.distribute strategy.

class TPUStrategy: TPU distribution strategy implementation.
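All of these strategies share the same usage pattern: construct the strategy, then create variables and models inside `strategy.scope()` so the strategy controls their placement. A minimal sketch using `CentralStorageStrategy` (the single-machine case, so it runs without a cluster; the variable name `v` is illustrative):

```python
import tensorflow as tf

# A minimal sketch, assuming a single machine (CPU, optionally GPUs).
# CentralStorageStrategy keeps all variables on one device while compute
# can run across the machine's local devices.
strategy = tf.compat.v1.distribute.experimental.CentralStorageStrategy()

with strategy.scope():
    # Variables created inside the scope are placed by the strategy.
    v = tf.Variable(1.0)

# Number of replicas participating in sync training (1 on a CPU-only machine).
print(strategy.num_replicas_in_sync)
```

The multi-worker strategies (`MultiWorkerMirroredStrategy`, `ParameterServerStrategy`) follow the same scope-based pattern but additionally require cluster configuration, typically via the `TF_CONFIG` environment variable.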

© 2020 The TensorFlow Authors. All rights reserved.
Licensed under the Creative Commons Attribution License 3.0.
Code samples licensed under the Apache 2.0 License.
https://www.tensorflow.org/versions/r2.3/api_docs/python/tf/compat/v1/distribute/experimental