ml_switcheroo.cli.handlers.suggest
==================================

.. py:module:: ml_switcheroo.cli.handlers.suggest

.. autoapi-nested-parse::

   Suggest Command Handlers.

   This module implements the ``suggest`` command, which generates a
   context-rich prompt for Large Language Models (LLMs). The prompt includes:

   1. Introspection data for the requested API (signatures, docstrings).
   2. The ODL JSON Schema structure.
   3. A few-shot example of a correct ODL definition.

   This output is designed to be piped directly to an LLM to generate valid
   Operation Definition Language (YAML) for missing operations.

Functions
---------

.. autoapisummary::

   ml_switcheroo.cli.handlers.suggest.handle_suggest

Module Contents
---------------

.. py:function:: handle_suggest(api_path: str, out_dir: Optional[pathlib.Path] = None, batch_size: int = 50) -> int

   Generates an LLM prompt for defining new operations.

   Supports both single API paths (e.g. ``torch.nn.Linear``) and module
   wildcards (e.g. ``jax.numpy.*``).

   Steps:

   1. Resolves target objects (a single object, or a list from a wildcard).
   2. Inspects live Python objects to get signatures and docs.
   3. Retrieves the JSON Schema for ``OperationDef``.
   4. Constructs structured prompts with a header, batched ops, and a footer.
   5. Writes output to stdout, or to files if ``out_dir`` is specified.

   :param api_path: The path to inspect. Can be a dotted path to an object or
       a module path ending in ``.*``.
   :param out_dir: Optional directory to save batched ``.md`` files.
   :param batch_size: Number of operations per batch/file.

   :returns: Exit code (0 for success, 1 for failure).
   :rtype: int
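
   A minimal usage sketch, assuming the handler is imported and called
   directly with the documented signature; the target paths and output
   directory below are illustrative only:

   .. code-block:: python

      from pathlib import Path

      from ml_switcheroo.cli.handlers.suggest import handle_suggest

      # Single object: print one prompt to stdout.
      exit_code = handle_suggest("torch.nn.Linear")

      # Module wildcard: write batched .md prompt files to a directory
      # (illustrative path), with up to 50 operations per file by default.
      exit_code = handle_suggest(
          "jax.numpy.*",
          out_dir=Path("prompts/jax_numpy"),
          batch_size=50,
      )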