GradScaler
Utility that performs dynamic gradient (loss) scaling for mixed-precision training.
Abstract Signature:
GradScaler(init_scale: float = 65536.0, growth_factor: float = 2.0, backoff_factor: float = 0.5, growth_interval: int = 2000, enabled: bool = True)
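The parameters above drive a dynamic loss-scaling loop: the loss is multiplied by the current scale before backpropagation, gradients are divided back before the optimizer step, and the scale is reduced by `backoff_factor` whenever an overflow (inf/NaN gradient) is detected, or grown by `growth_factor` after `growth_interval` consecutive clean steps. The sketch below is a minimal NumPy illustration of that algorithm, with semantics assumed from the signature; it is not the actual backend implementation.

```python
import numpy as np

class GradScaler:
    """Minimal sketch of dynamic loss scaling (assumed semantics)."""

    def __init__(self, init_scale: float = 65536.0, growth_factor: float = 2.0,
                 backoff_factor: float = 0.5, growth_interval: int = 2000,
                 enabled: bool = True):
        self._scale = init_scale
        self._growth_factor = growth_factor
        self._backoff_factor = backoff_factor
        self._growth_interval = growth_interval
        self._enabled = enabled
        self._growth_tracker = 0  # clean steps since last overflow

    def scale(self, loss):
        # Multiply the loss so small fp16 gradients do not underflow.
        return loss * self._scale if self._enabled else loss

    def unscale(self, grads):
        # Divide gradients back to their true magnitude before the
        # optimizer step (and before gradient clipping, if any).
        inv = 1.0 / self._scale
        return [np.asarray(g) * inv for g in grads]

    def update(self, found_inf: bool):
        # Back off on overflow; grow after growth_interval clean steps.
        if not self._enabled:
            return
        if found_inf:
            self._scale *= self._backoff_factor
            self._growth_tracker = 0
        else:
            self._growth_tracker += 1
            if self._growth_tracker >= self._growth_interval:
                self._scale *= self._growth_factor
                self._growth_tracker = 0
```

For example, with `init_scale=4.0` an overflow halves the scale to `2.0`, and two subsequent clean steps with `growth_interval=2` double it back to `4.0`.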
NumPy
API:
Strategy: Custom / Partial
Apple MLX
API:
Strategy: Custom / Partial