kempnerforge.training.scheduler¶
Learning rate schedulers for KempnerForge.
All schedulers are step-based (not epoch-based) and compose a warmup phase with a decay phase:
Cosine: warmup → cosine decay to min_lr
Linear: warmup → linear decay to min_lr
WSD: warmup → stable → decay to min_lr (cosine/linear/sqrt cooldown)
Constant: warmup → flat LR (for ablations)
REX: warmup → polynomial decay (1 - t/T)^alpha
None: constant factor=1.0 (for schedule-free optimizers)
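To make the composition concrete, here is a minimal sketch of the WSD (warmup → stable → decay) multiplier with a linear cooldown. This is a standalone illustration, not the library's implementation; the function name, argument names, and the `min_lr_ratio` convention are assumptions.

```python
def wsd_factor(step, warmup_steps, stable_steps, max_steps, min_lr_ratio=0.1):
    # Hypothetical sketch of warmup -> stable -> decay (linear cooldown variant).
    # Returns a multiplier in [0, 1] applied to the base learning rate.
    if step < warmup_steps:
        # Linear warmup from 0 to the peak LR.
        return step / max(1, warmup_steps)
    if step < warmup_steps + stable_steps:
        # Stable phase: hold at the peak LR.
        return 1.0
    # Cooldown: linear decay from the peak down to min_lr over the remaining steps.
    decay_len = max(1, max_steps - warmup_steps - stable_steps)
    t = min(1.0, (step - warmup_steps - stable_steps) / decay_len)
    return 1.0 - (1.0 - min_lr_ratio) * t
```

The cosine and sqrt cooldown variants differ only in the last expression; the warmup and stable phases are shared.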
Functions

build_scheduler(optimizer, config, max_steps) — Build a LR scheduler from config.
- kempnerforge.training.scheduler.build_scheduler(optimizer, config, max_steps)[source]¶
Build a LR scheduler from config.
- Parameters:
optimizer (torch.optim.Optimizer) – Optimizer to schedule.
config (SchedulerConfig) – Scheduler configuration.
max_steps (int) – Total training steps (used to compute decay length).
- Returns:
A PyTorch LambdaLR scheduler.
- Return type:
torch.optim.lr_scheduler.LambdaLR
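Since the return value is a standard LambdaLR, it slots into an ordinary step-based training loop. The sketch below builds the equivalent scheduler by hand for the cosine case; the config values (`warmup_steps`, `min_lr_ratio`) are hypothetical field names, not confirmed attributes of SchedulerConfig.

```python
import math
import torch

# Hypothetical config values; the SchedulerConfig field names are assumptions.
warmup_steps, max_steps, min_lr_ratio = 100, 1000, 0.1

model = torch.nn.Linear(4, 4)
optimizer = torch.optim.AdamW(model.parameters(), lr=3e-4)

def cosine_factor(step):
    # Warmup -> cosine decay to min_lr, per the scheduler table above.
    if step < warmup_steps:
        return step / max(1, warmup_steps)
    progress = min(1.0, (step - warmup_steps) / (max_steps - warmup_steps))
    return min_lr_ratio + (1.0 - min_lr_ratio) * 0.5 * (1.0 + math.cos(math.pi * progress))

# build_scheduler returns a LambdaLR wrapping a factor function like this one.
scheduler = torch.optim.lr_scheduler.LambdaLR(optimizer, cosine_factor)

for _ in range(max_steps):
    optimizer.step()
    scheduler.step()  # step-based: call once per optimizer step, not per epoch
```

Because the schedule is step-based, `scheduler.step()` belongs inside the inner training loop; calling it per epoch would stretch the decay far past `max_steps`.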