IsDistributed

Returns True if the distributed RPC framework is initialized.

PyTorch

API: torch.is_distributed
Strategy: Direct Mapping

JAX (Core)

API: —
Strategy: Macro 'False'

NumPy

API: —
Strategy: Macro 'False'

Keras

API: —
Strategy: Macro 'False'

TensorFlow

API: —
Strategy: Macro 'False'

Apple MLX

API: —
Strategy: Macro 'False'

Flax NNX

API: —
Strategy: Macro 'False'

PaxML / Praxis

API: —
Strategy: Macro 'False'
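
The table above can be sketched as a single dispatch function: PyTorch is the only backend with a direct mapping, and every other backend compiles the op to the constant False (the Macro 'False' strategy). This is a minimal illustration, not part of any library; the backend names and the zero-argument call to torch.is_distributed are assumptions taken from the table.

```python
# Hypothetical dispatcher illustrating the mapping table above.
# Only the "torch" branch maps to a real API; all other backends
# lower IsDistributed to the literal False ("Macro 'False'").

_MACRO_FALSE_BACKENDS = {
    "jax", "numpy", "keras", "tensorflow", "mlx", "flax_nnx", "paxml",
}

def is_distributed(backend: str = "numpy") -> bool:
    if backend == "torch":
        import torch  # direct mapping per the table
        # Calling torch.is_distributed as a zero-argument predicate
        # is an assumption here, not a verified signature.
        return bool(torch.is_distributed())
    if backend in _MACRO_FALSE_BACKENDS:
        return False  # Macro 'False': no distributed-RPC concept to query
    raise ValueError(f"unknown backend: {backend!r}")
```

For any non-PyTorch backend the result is always False, so callers can branch on it without importing that backend's distributed machinery.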