Module: tf.distribute.experimental

Experimental Distribution Strategy library.

Classes

class CentralStorageStrategy: A one-machine strategy that puts all variables on a single device.
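
A minimal sketch of how this strategy is typically used, assuming a single machine whose local GPUs (if any) do the compute while variables stay on one device:

    import tensorflow as tf

    # Variables live on a single device (the CPU by default); compute is
    # replicated across the machine's local GPUs, if any are present.
    strategy = tf.distribute.experimental.CentralStorageStrategy()
    with strategy.scope():
        # Variables created under the scope are placed on the central device.
        v = tf.Variable(1.0)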

class CollectiveCommunication: Communication choices for CollectiveOps.
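
A short sketch of how the enum is used; the NCCL choice assumes NVIDIA GPUs are available (RING and AUTO are the other options):

    import tensorflow as tf

    # The communication choice is passed to MultiWorkerMirroredStrategy at
    # construction time: AUTO lets the runtime pick, RING uses ring-based
    # collectives, NCCL assumes NVIDIA GPUs.
    strategy = tf.distribute.experimental.MultiWorkerMirroredStrategy(
        communication=tf.distribute.experimental.CollectiveCommunication.NCCL)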

class CollectiveHints: Hints for collective operations like AllReduce.
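
A hedged sketch of passing hints to a per-replica all-reduce; the 50 MB bytes_per_pack value is only illustrative:

    import tensorflow as tf

    strategy = tf.distribute.experimental.MultiWorkerMirroredStrategy()

    # Illustrative value: pack tensors into roughly 50 MB buckets before
    # each all-reduce.
    hints = tf.distribute.experimental.CollectiveHints(
        bytes_per_pack=50 * 1024 * 1024)

    def replica_fn(v):
        # Inside a replica context the hints are forwarded to the
        # underlying collective op.
        ctx = tf.distribute.get_replica_context()
        return ctx.all_reduce(tf.distribute.ReduceOp.SUM, v,
                              experimental_hints=hints)

    strategy.run(replica_fn, args=(tf.constant(1.0),))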

class MultiWorkerMirroredStrategy: A distribution strategy for synchronous training on multiple workers.
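
A minimal sketch of synchronous multi-worker training with Keras; it assumes every worker runs this same program and that the cluster is described through the TF_CONFIG environment variable (without TF_CONFIG the strategy falls back to a single worker):

    import tensorflow as tf

    strategy = tf.distribute.experimental.MultiWorkerMirroredStrategy()

    with strategy.scope():
        # Variables are mirrored on every worker and kept in sync with
        # collective all-reduce during training.
        model = tf.keras.Sequential([tf.keras.layers.Dense(1, input_shape=(4,))])
        model.compile(optimizer='sgd', loss='mse')

    dataset = tf.data.Dataset.from_tensors(
        (tf.zeros([4]), tf.zeros([1]))).repeat(64).batch(8)
    model.fit(dataset, epochs=1)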

class ParameterServerStrategy: An asynchronous multi-worker parameter server tf.distribute strategy.
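
A sketch of constructing the strategy; in a real deployment TF_CONFIG describes the "chief", "worker" and "ps" tasks and each task runs the same program, while with no TF_CONFIG set the strategy is assumed here to fall back to a single-machine mode:

    import tensorflow as tf

    strategy = tf.distribute.experimental.ParameterServerStrategy()

    with strategy.scope():
        # Variables created under the scope are placed on the parameter
        # server device(s); computation runs on the workers.
        v = tf.Variable(0.0)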

class TPUStrategy: Synchronous training on TPUs and TPU Pods.
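
A minimal sketch, assuming access to a TPU; the empty tpu='' argument is a placeholder for the name or address of your TPU worker:

    import tensorflow as tf

    # Locate the TPU, connect to it, and initialize the TPU system before
    # creating the strategy.
    resolver = tf.distribute.cluster_resolver.TPUClusterResolver(tpu='')
    tf.config.experimental_connect_to_cluster(resolver)
    tf.tpu.experimental.initialize_tpu_system(resolver)

    strategy = tf.distribute.experimental.TPUStrategy(resolver)
    with strategy.scope():
        # Variables and the model are replicated across the TPU cores.
        model = tf.keras.Sequential([tf.keras.layers.Dense(1)])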

class ValueContext: A class wrapping information needed by a distribute function.
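
A short sketch of where a ValueContext shows up: the value_fn passed to Strategy.experimental_distribute_values_from_function receives one per replica (MirroredStrategy is used here only as a convenient example strategy):

    import tensorflow as tf

    strategy = tf.distribute.MirroredStrategy()

    def value_fn(ctx):
        # ctx is a tf.distribute.experimental.ValueContext exposing
        # replica_id_in_sync_group and num_replicas_in_sync.
        return tf.constant(ctx.replica_id_in_sync_group)

    per_replica = strategy.experimental_distribute_values_from_function(value_fn)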

© 2020 The TensorFlow Authors. All rights reserved.
Licensed under the Creative Commons Attribution License 3.0.
Code samples licensed under the Apache 2.0 License.
https://www.tensorflow.org/versions/r2.3/api_docs/python/tf/distribute/experimental