ml_switcheroo.plugins.mlx_extras

Plugin for MLX Ecosystem Mapping.

Handles:

1. Compilation: @torch.compile -> @mx.compile.
2. Eager Evaluation: torch.cuda.synchronize() -> warning / no-op.
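For orientation, a before/after sketch of what these two mappings produce on user code. The compile rewrite and the warning substitution follow the behavior described above; the exact wording of the emitted warning is illustrative, not the plugin's literal message.

# Before: PyTorch source handled by this plugin.
import torch

@torch.compile(fullgraph=True, dynamic=True)
def step(x):
    return x * 2

step(torch.ones(4))
torch.cuda.synchronize()

# After: MLX output (sketch; warning text is illustrative).
import mlx.core as mx

@mx.compile
def step(x):
    return x * 2

step(mx.ones(4))
print("torch.cuda.synchronize() removed: MLX is lazy; call mx.eval(...) on the arrays you need")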

Functions

transform_compiler(node, ctx) → libcst.CSTNode

Hook: Maps JIT compilation decorators.

transform_synchronize(node, ctx) → libcst.CSTNode

Hook: Maps barrier synchronization to a warning.

Module Contents

ml_switcheroo.plugins.mlx_extras.transform_compiler(node: libcst.Decorator | libcst.Call, ctx: ml_switcheroo.core.hooks.HookContext) → libcst.CSTNode

Hook: Maps JIT compilation decorators.

Triggers: Operations mapped with requires_plugin: "mlx_compiler".

Transformation:

Input: @torch.compile(fullgraph=True, dynamic=True)
Output: @mx.compile (incompatible kwargs are stripped)

Decoupling:

Looks up the target API for the Compile operation in the semantics mappings (e.g. mlx.core.compile) rather than hard-coding it. A sketch of the rewrite is shown below.
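A minimal sketch of the decorator rewrite under libcst. In the hook itself the replacement dotted name comes from the semantics lookup via the HookContext; here it is hard-coded as mx.compile, and the helper name is hypothetical rather than the plugin's actual implementation.

from __future__ import annotations

import libcst as cst

def rewrite_compile_decorator(node: cst.Decorator | cst.Call) -> cst.CSTNode:
    # In the real hook the replacement name is resolved from semantics
    # (e.g. "mlx.core.compile"); hard-coded here for illustration.
    target = cst.parse_expression("mx.compile")
    if isinstance(node, cst.Decorator):
        # @torch.compile(fullgraph=True, dynamic=True) -> @mx.compile
        # Replacing the whole decorator expression drops the incompatible kwargs.
        return node.with_changes(decorator=target)
    if isinstance(node, cst.Call):
        # Bare call form: keep positional arguments, drop keyword arguments.
        positional = [a for a in node.args if a.keyword is None]
        return node.with_changes(func=target, args=positional)
    return node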

ml_switcheroo.plugins.mlx_extras.transform_synchronize(node: libcst.Call, ctx: ml_switcheroo.core.hooks.HookContext) → libcst.CSTNode

Hook: Maps barrier synchronization to a warning.

MLX evaluates lazily, while torch.cuda.synchronize() implies a global device barrier. The closest MLX equivalent, mx.eval(), requires explicit arguments (the arrays to materialize). Since the hook cannot infer which state variables those are, it replaces the call with a print statement that alerts the user.
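A sketch of that substitution, assuming the helper name and the warning text are illustrative; the actual hook may word its message differently.

import libcst as cst

def rewrite_synchronize(node: cst.Call) -> cst.CSTNode:
    # torch.cuda.synchronize() -> print(...) alert. mx.eval(...) would need
    # the arrays to force, which cannot be inferred at this point in the tree.
    return cst.parse_expression(
        'print("ml_switcheroo: torch.cuda.synchronize() dropped; '
        'MLX is lazy, call mx.eval(...) on the arrays you need")'
    )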