ml_switcheroo.cli.handlers.suggest¶

Suggest Command Handlers.

This module implements the suggest command, which generates a context-rich prompt for Large Language Models (LLMs). The prompt includes:

1. Introspection data for the requested API (signatures, docstrings).
2. The ODL JSON Schema structure.
3. A few-shot example of a correct ODL definition.

This output is designed to be piped directly to an LLM to generate valid Operation Definition Language (YAML) for missing operations.
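A minimal sketch of driving this module from Python instead of the CLI, capturing the prompt that handle_suggest writes to stdout so it can be handed to any LLM client. The ask_llm call is a hypothetical placeholder; only handle_suggest and its documented signature come from this module.

```python
# Hedged usage sketch: capture the generated prompt from stdout and pass it
# to an LLM client of your choice. ``ask_llm`` is a hypothetical placeholder;
# only ``handle_suggest`` is part of this module.
import io
from contextlib import redirect_stdout

from ml_switcheroo.cli.handlers.suggest import handle_suggest

buffer = io.StringIO()
with redirect_stdout(buffer):
    exit_code = handle_suggest("torch.nn.Linear")  # single API path; prompt goes to stdout

if exit_code == 0:
    prompt = buffer.getvalue()
    # response_yaml = ask_llm(prompt)  # hypothetical: send the prompt to an LLM
```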

Functions¶

handle_suggest(api_path, out_dir=None, batch_size=50) → int

Generates an LLM prompt for defining new operations.

Module Contents¶

ml_switcheroo.cli.handlers.suggest.handle_suggest(api_path: str, out_dir: pathlib.Path | None = None, batch_size: int = 50) → int¶

Generates an LLM prompt for defining new operations.

Supports both single API paths (e.g. torch.nn.Linear) and module wildcards (e.g. jax.numpy.*).

Steps:

1. Resolves target objects (a single object, or a list expanded from a wildcard).
2. Inspects live Python objects to get signatures and docs.
3. Retrieves the JSON Schema for OperationDef.
4. Constructs structured prompts with Header, Batched Ops, and Footer sections.
5. Writes output to stdout, or to files if out_dir is specified.

Parameters:
  • api_path – The path to inspect. Can be a dotted path to an object or a module path ending in .*.

  • out_dir – Optional directory to save batched .md files.

  • batch_size – Number of operations per batch/file.

Returns:

Exit code (0 for success, 1 for failure).

Return type:

int
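As a companion sketch, assuming only the signature documented above, a wildcard run that writes batched prompt files instead of printing to stdout might look like this; the output directory name and batch size are illustrative values.

```python
# Hedged sketch: expand a module wildcard and write batched prompt files.
from pathlib import Path

from ml_switcheroo.cli.handlers.suggest import handle_suggest

exit_code = handle_suggest(
    "jax.numpy.*",            # module wildcard, resolved to a list of targets
    out_dir=Path("prompts"),  # write batched .md files here instead of stdout
    batch_size=25,            # operations per batch/file
)
print("prompts written" if exit_code == 0 else "suggest failed")
```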