TimeDistributed

This wrapper applies a layer to every temporal slice of an input.

Abstract Signature:

TimeDistributed(layer)

PyTorch

API: β€”
Strategy: Plugin (time_distributed_shim)
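PyTorch has no built-in TimeDistributed wrapper, so a shim is needed. A common way to implement one is to merge the batch and time axes, apply the wrapped layer once, and then restore the time axis. The sketch below illustrates that pattern with NumPy; the `time_distributed` helper and the `layer_fn` callable are illustrative stand-ins, not the actual `time_distributed_shim` plugin code.

```python
import numpy as np

def time_distributed(layer_fn, x):
    """Apply layer_fn to every temporal slice of x.

    x has shape (batch, time, *features). The batch and time axes are
    merged, layer_fn is applied once to the flattened batch, and the
    time axis is restored afterwards. This is an illustrative sketch,
    not the plugin's real implementation.
    """
    batch, time = x.shape[0], x.shape[1]
    flat = x.reshape(batch * time, *x.shape[2:])    # (B*T, *features)
    out = layer_fn(flat)                            # layer never sees the time axis
    return out.reshape(batch, time, *out.shape[1:])

# Example: a "dense" layer as a plain matmul with weights of shape (16, 8).
w = np.ones((16, 8))
x = np.ones((4, 5, 16))                  # (batch=4, time=5, features=16)
y = time_distributed(lambda a: a @ w, x)
print(y.shape)  # (4, 5, 8)
```

The same reshape trick carries over directly to a `torch.nn.Module` wrapper: flatten with `view`/`reshape` before the wrapped layer's `forward`, and unflatten after.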

Keras

API: keras.layers.TimeDistributed
Strategy: Direct Mapping

TensorFlow

API: tf.keras.layers.TimeDistributed
Strategy: Direct Mapping
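Since Keras and TensorFlow expose the wrapper directly, usage is a one-liner. A minimal example, assuming a Dense layer as the wrapped layer and an input of shape (batch, time, features):

```python
import numpy as np
import tensorflow as tf

# TimeDistributed applies the wrapped Dense layer independently to each
# of the 5 temporal slices of shape (16,).
layer = tf.keras.layers.TimeDistributed(tf.keras.layers.Dense(8))
x = np.zeros((4, 5, 16), dtype=np.float32)  # (batch=4, time=5, features=16)
y = layer(x)
print(tuple(y.shape))  # (4, 5, 8): the time axis is preserved
```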

Flax NNX

API: nnx.Scan
Strategy: Direct Mapping