espnet2.schedulers package¶
espnet2.schedulers.__init__¶
espnet2.schedulers.noam_lr¶
Noam learning rate scheduler module.
class espnet2.schedulers.noam_lr.NoamLR(optimizer: torch.optim.optimizer.Optimizer, model_size: Union[int, float] = 320, warmup_steps: Union[int, float] = 25000, last_epoch: int = -1)[source]¶
Bases: torch.optim.lr_scheduler._LRScheduler, espnet2.schedulers.abs_scheduler.AbsBatchStepScheduler
The LR scheduler proposed by Noam.
- Ref: “Attention Is All You Need”, https://arxiv.org/pdf/1706.03762.pdf
- FIXME(kamo): PyTorch doesn’t provide _LRScheduler as a public class, so the behaviour isn’t guaranteed in future PyTorch versions.
- NOTE(kamo): The “model_size” in the original implementation is derived from the model, but in this implementation it is a constant value. You need to change it if the model changes.
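A minimal usage sketch (not part of the ESPnet documentation): the toy model, optimizer, and loop are placeholders, assuming only that this batch-step scheduler is stepped once per training batch.
>>> import torch
>>> from espnet2.schedulers.noam_lr import NoamLR
>>> model = torch.nn.Linear(80, 4)  # stand-in model, not an ESPnet model
>>> optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
>>> scheduler = NoamLR(optimizer, model_size=320, warmup_steps=25000)
>>> for batch in range(3):
...     optimizer.zero_grad()
...     model(torch.randn(8, 80)).sum().backward()
...     optimizer.step()
...     scheduler.step()  # batch-step scheduler: call after every optimizer.step()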
espnet2.schedulers.warmup_step_lr¶
Step (with Warm up) learning rate scheduler module.
class espnet2.schedulers.warmup_step_lr.WarmupStepLR(optimizer: torch.optim.optimizer.Optimizer, warmup_steps: Union[int, float] = 25000, steps_per_epoch: int = 10000, step_size: int = 1, gamma: float = 0.1, last_epoch: int = -1)[source]¶
Bases: torch.optim.lr_scheduler._LRScheduler, espnet2.schedulers.abs_scheduler.AbsBatchStepScheduler
The WarmupStepLR scheduler.
This scheduler is the combination of WarmupLR and StepLR:
- WarmupLR:
  lr = optimizer.lr * warmup_step ** 0.5 * min(step ** -0.5, step * warmup_step ** -1.5)
- WarmupStepLR:
  - if step <= warmup_step:
    lr = optimizer.lr * warmup_step ** 0.5 * min(step ** -0.5, step * warmup_step ** -1.5)
  - else:
    lr = optimizer.lr * (gamma ** (epoch // step_size))
Note that the maximum lr equals optimizer.lr in this scheduler.
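A minimal usage sketch (placeholders, not ESPnet internals); the keyword values shown are simply the documented defaults:
>>> import torch
>>> from espnet2.schedulers.warmup_step_lr import WarmupStepLR
>>> model = torch.nn.Linear(80, 4)
>>> optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
>>> scheduler = WarmupStepLR(
...     optimizer, warmup_steps=25000, steps_per_epoch=10000, step_size=1, gamma=0.1)
>>> for batch in range(3):
...     optimizer.zero_grad()
...     model(torch.randn(8, 80)).sum().backward()
...     optimizer.step()
...     scheduler.step()  # per batch; after warmup the lr decays by gamma every step_size epochs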
espnet2.schedulers.abs_scheduler¶
espnet2.schedulers.warmup_lr¶
Warm up learning rate scheduler module.
class espnet2.schedulers.warmup_lr.WarmupLR(optimizer: torch.optim.optimizer.Optimizer, warmup_steps: Union[int, float] = 25000, last_epoch: int = -1)[source]¶
Bases: torch.optim.lr_scheduler._LRScheduler, espnet2.schedulers.abs_scheduler.AbsBatchStepScheduler
The WarmupLR scheduler.
This scheduler is almost the same as the NoamLR scheduler except for the following difference:
- NoamLR:
  lr = optimizer.lr * model_size ** -0.5 * min(step ** -0.5, step * warmup_step ** -1.5)
- WarmupLR:
  lr = optimizer.lr * warmup_step ** 0.5 * min(step ** -0.5, step * warmup_step ** -1.5)
Note that the maximum lr equals optimizer.lr in this scheduler.
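To illustrate the difference, here is a plain-arithmetic sketch of the two formulas above (not the schedulers’ internal code); lr0 stands for optimizer.lr:
>>> def noam_lr(lr0, step, model_size=320, warmup_steps=25000):
...     return lr0 * model_size ** -0.5 * min(step ** -0.5, step * warmup_steps ** -1.5)
>>> def warmup_lr(lr0, step, warmup_steps=25000):
...     return lr0 * warmup_steps ** 0.5 * min(step ** -0.5, step * warmup_steps ** -1.5)
>>> peak = warmup_lr(1e-3, 25000)  # equals optimizer.lr (1e-3) at step == warmup_steps, up to float rounding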