AdamW
Optimizer that implements the AdamW algorithm: Adam with decoupled weight decay, as described in Loshchilov & Hutter, "Decoupled Weight Decay Regularization". Unlike L2 regularization folded into the gradient, the weight decay term is applied directly to the parameters at each step.
Abstract Signature:
AdamW(learning_rate: float = 0.001, weight_decay: float = 0.004, beta_1: float = 0.9, beta_2: float = 0.999, epsilon: float = 1e-07, amsgrad: bool = False)
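To illustrate what the parameters above control, here is a minimal pure-Python sketch of a single AdamW update for one scalar parameter. The helper `adamw_step` is hypothetical (not part of any library's API); it follows the decoupled weight-decay formulation and uses the default hyperparameters from the signature.

```python
import math

def adamw_step(param, grad, m, v, t, learning_rate=0.001, weight_decay=0.004,
               beta_1=0.9, beta_2=0.999, epsilon=1e-07):
    """One AdamW update for a scalar parameter (hypothetical illustration).

    m, v are the running first and second moment estimates; t is the
    1-based step count used for bias correction.
    """
    # Update biased first and second moment estimates
    m = beta_1 * m + (1 - beta_1) * grad
    v = beta_2 * v + (1 - beta_2) * grad * grad
    # Bias-correct the moments (important in early steps when m, v start at 0)
    m_hat = m / (1 - beta_1 ** t)
    v_hat = v / (1 - beta_2 ** t)
    # Adam step plus decoupled weight decay applied directly to the parameter
    param = param - learning_rate * (m_hat / (math.sqrt(v_hat) + epsilon)
                                     + weight_decay * param)
    return param, m, v

# Example: one step from param=1.0 with gradient 0.5
p, m, v = adamw_step(param=1.0, grad=0.5, m=0.0, v=0.0, t=1)
```

Note that with `amsgrad=True` the update would instead divide by the running maximum of `v_hat` seen so far, which this sketch omits for brevity.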