IsDistributed
=============

Returns ``True`` if the DistributedRPC framework is initialized.

.. list-table::
   :header-rows: 1

   * - Framework
     - API
     - Strategy
   * - PyTorch
     - ``torch.is_distributed``
     - Direct Mapping
   * - JAX (Core)
     -
     - Macro 'False'
   * - NumPy
     -
     - Macro 'False'
   * - Keras
     -
     - Macro 'False'
   * - TensorFlow
     -
     - Macro 'False'
   * - Apple MLX
     -
     - Macro 'False'
   * - Flax NNX
     -
     - Macro 'False'
   * - PaxML / Praxis
     -
     - Macro 'False'
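
To make the two strategies concrete, the sketch below (the ``translate_is_distributed`` helper and its backend strings are hypothetical and belong to none of the frameworks above) forwards the call unchanged for the PyTorch backend (Direct Mapping) and replaces it with the constant ``False`` for every other backend (Macro 'False').

.. code-block:: python

   # Hypothetical sketch only: ``translate_is_distributed`` and the backend
   # names are illustrative, not part of any real framework or converter.
   def translate_is_distributed(backend: str) -> str:
       """Return the expression that replaces a ``torch.is_distributed`` call."""
       if backend == "PyTorch":
           # Direct Mapping: the original call is kept as-is.
           return "torch.is_distributed()"
       # Macro 'False': the remaining backends have no DistributedRPC concept,
       # so the call is lowered to the constant False at translation time.
       return "False"

   for backend in ("PyTorch", "JAX (Core)", "NumPy", "TensorFlow", "Apple MLX"):
       print(f"{backend:12} -> {translate_is_distributed(backend)}")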