kempnerforge.training.scheduler

Learning rate schedulers for KempnerForge.

All schedulers are step-based (not epoch-based) and compose a warmup phase with a decay phase:

  • Cosine: warmup → cosine decay to min_lr

  • Linear: warmup → linear decay to min_lr

  • WSD: warmup → stable → decay to min_lr (cosine/linear/sqrt cooldown)

  • Constant: warmup → flat LR (for ablations)

  • REX: warmup → polynomial decay (1 - t/T)^alpha

  • None: constant factor=1.0 (for schedule-free optimizers)
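
A minimal sketch of the first composition above (warmup → cosine decay) expressed as a LambdaLR multiplier. This is an illustration of the schedule shape, not the library's implementation; the warmup_steps, max_steps, and min_lr_ratio values are placeholders:

    import math
    import torch

    def cosine_with_warmup(step, warmup_steps, max_steps, min_lr_ratio=0.1):
        # Linear warmup: multiplier rises from 0 to 1 over warmup_steps.
        if step < warmup_steps:
            return step / max(1, warmup_steps)
        # Cosine decay: multiplier falls from 1 to min_lr_ratio over the rest.
        progress = (step - warmup_steps) / max(1, max_steps - warmup_steps)
        cosine = 0.5 * (1.0 + math.cos(math.pi * min(progress, 1.0)))
        return min_lr_ratio + (1.0 - min_lr_ratio) * cosine

    model = torch.nn.Linear(8, 8)
    optimizer = torch.optim.AdamW(model.parameters(), lr=3e-4)
    scheduler = torch.optim.lr_scheduler.LambdaLR(
        optimizer, lambda s: cosine_with_warmup(s, warmup_steps=100, max_steps=1000)
    )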

Functions

kempnerforge.training.scheduler.build_scheduler(optimizer, config, max_steps)

Build a LR scheduler from config.

Parameters:

  • optimizer – the torch.optim.Optimizer whose learning rate will be scheduled.

  • config – scheduler configuration selecting the schedule type and its warmup, decay, and minimum-LR settings.

  • max_steps – total number of optimizer steps; used to size the step-based schedule.

Returns:

PyTorch LambdaLR scheduler.

Return type:

torch.optim.lr_scheduler.LambdaLR
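
Hypothetical usage sketch. The names train_config, model, and dataloader are stand-ins, and the exact config fields are assumptions about KempnerForge's config schema rather than documented behavior:

    import torch
    from kempnerforge.training.scheduler import build_scheduler

    optimizer = torch.optim.AdamW(model.parameters(), lr=3e-4)
    scheduler = build_scheduler(optimizer, config=train_config, max_steps=10_000)

    for step, batch in enumerate(dataloader):
        loss = model(batch).mean()      # placeholder loss computation
        loss.backward()
        optimizer.step()
        scheduler.step()                # step-based: advance once per optimizer step
        optimizer.zero_grad()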