ml_switcheroo.frameworks.common.data¶

Data Loader Standard & Runtime Shim.

This module defines the Generic Data Loader Shim used when transpiling PyTorch DataLoader code to frameworks that lack a direct equivalent (like JAX or NumPy). It also provides the Semantic Configuration injection to ensure the engine detects the DataLoader API.

Capabilities handled by the Shim:

1. Batching: batch_size.
2. Shuffling: shuffle.
3. Dropping Last: drop_last.
4. Dataset Protocol: supports __len__ and __getitem__.
5. Multi-Processing Stubs: num_workers, pin_memory, and persistent_workers are accepted as no-ops to ensure compatibility with performance-tuned Torch code.

The Shim is designed to be a lightweight iterator yielding collated batches.
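The actual shim source is returned by get_shim_code() below. For orientation only, here is a minimal sketch of what such an iterator can look like, assuming a list-like dataset and the argument names documented on this page; the real generated shim may differ in detail.

    import random
    from typing import Any, Callable, Iterator, List, Optional, Sequence


    def _default_collate(items: List[Any]) -> List[Any]:
        # Minimal collation: return the batch as a plain list.
        return items


    class GenericDataLoader:
        """Illustrative stand-in: iterate a __len__/__getitem__ dataset in batches."""

        def __init__(
            self,
            dataset: Sequence[Any],
            batch_size: int = 1,
            shuffle: bool = False,
            drop_last: bool = False,
            collate_fn: Optional[Callable[[List[Any]], Any]] = None,
            num_workers: int = 0,              # accepted for compatibility, ignored
            pin_memory: bool = False,          # accepted for compatibility, ignored
            persistent_workers: bool = False,  # accepted for compatibility, ignored
        ) -> None:
            self.dataset = dataset
            self.batch_size = batch_size
            self.shuffle = shuffle
            self.drop_last = drop_last
            self.collate_fn = collate_fn or _default_collate

        def __len__(self) -> int:
            n = len(self.dataset)
            if self.drop_last:
                return n // self.batch_size
            return (n + self.batch_size - 1) // self.batch_size

        def __iter__(self) -> Iterator[Any]:
            indices = list(range(len(self.dataset)))
            if self.shuffle:
                random.shuffle(indices)
            for start in range(0, len(indices), self.batch_size):
                batch_idx = indices[start:start + self.batch_size]
                if self.drop_last and len(batch_idx) < self.batch_size:
                    break
                yield self.collate_fn([self.dataset[i] for i in batch_idx])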

Functions¶

get_dataloader_semantics() → Dict[str, Any]

Returns the Semantic Definition for the DataLoader.

get_shim_code() → str

Returns the source code for the GenericDataLoader class.

Module Contents¶

ml_switcheroo.frameworks.common.data.get_dataloader_semantics() → Dict[str, Any]¶

Returns the Semantic Definition for the DataLoader.

Now includes performance arguments found in standard Torch examples. These are mapped to the Shim, which handles them gracefully (usually ignoring them).
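A brief usage sketch; the exact schema of the returned mapping is not documented on this page, so the snippet only inspects the keys rather than assuming their structure.

    from ml_switcheroo.frameworks.common.data import get_dataloader_semantics

    semantics = get_dataloader_semantics()
    # Inspect which DataLoader arguments (e.g. batch_size, shuffle, num_workers)
    # are mapped to the Shim; the value format is engine-specific.
    print(sorted(semantics))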

ml_switcheroo.frameworks.common.data.get_shim_code() → str¶

Returns the source code for the GenericDataLoader class. This code is injected into generated files by the convert_dataloader plugin.

Updates:

- Added num_workers, pin_memory, and persistent_workers to __init__.
- Included collate_fn stub support.
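In normal use the convert_dataloader plugin injects this source into generated files. As a rough illustration only, the returned code could also be executed into a namespace by hand, assuming the class is exposed at module level under the name GenericDataLoader and accepts the constructor arguments documented above.

    from ml_switcheroo.frameworks.common.data import get_shim_code

    namespace: dict = {}
    exec(get_shim_code(), namespace)  # materialise the shim class from its source
    GenericDataLoader = namespace["GenericDataLoader"]

    # A plain list satisfies the __len__/__getitem__ dataset protocol.
    loader = GenericDataLoader(list(range(10)), batch_size=4, shuffle=True, drop_last=True)
    for batch in loader:
        print(batch)  # two full batches of 4; the trailing partial batch is dropped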