Support for arbitrary schedulers in SequentialLR #68979
Labels
feature
A request for a proper, new feature.
module: LrScheduler
module: optimizer
Related to torch.optim
triaged
This issue has been looked at by a team member, and triaged and prioritized into an appropriate module
🚀 Feature
SequentialLR should allow for optional arguments in step(). This is a follow-up to #68978, which only requests support for ReduceLROnPlateau, while this issue requests broader support for arbitrary (custom) schedulers.
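For illustration, here is a minimal sketch of the desired usage. Note that this does not work today, since SequentialLR.step() accepts no arguments; the metric-forwarding call is the requested behavior, and train_one_epoch is a hypothetical helper:

```python
import torch
from torch.optim.lr_scheduler import SequentialLR, LinearLR, ReduceLROnPlateau

model = torch.nn.Linear(10, 2)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

# Warm up linearly for 5 epochs, then let ReduceLROnPlateau take over.
scheduler = SequentialLR(
    optimizer,
    schedulers=[
        LinearLR(optimizer, start_factor=0.1, total_iters=5),
        ReduceLROnPlateau(optimizer, patience=2),
    ],
    milestones=[5],
)

for epoch in range(10):
    val_loss = train_one_epoch(model, optimizer)  # hypothetical helper
    # Desired behavior: SequentialLR forwards the metric to whichever
    # scheduler is currently active. Today step() takes no arguments,
    # so a metric cannot reach ReduceLROnPlateau.
    scheduler.step(val_loss)
```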
Motivation
See #68978 or Lightning-AI/pytorch-lightning#10759 for one case where this is useful.
Pitch
Add the ability to pass arguments to SequentialLR.step(), since the underlying scheduler might need additional arguments, as is the case for ReduceLROnPlateau or other custom schedulers. See the sketch below.
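A minimal sketch of one possible implementation, written as a subclass so the upstream class stays untouched. It assumes SequentialLR's internal attributes (_schedulers, _milestones, last_epoch) as found in the current implementation; the *args/**kwargs forwarding is the proposed addition, not existing API:

```python
from bisect import bisect_right

from torch.optim.lr_scheduler import SequentialLR


class FlexibleSequentialLR(SequentialLR):
    """Sketch: forward step() arguments to the active scheduler."""

    def step(self, *args, **kwargs):
        self.last_epoch += 1
        idx = bisect_right(self._milestones, self.last_epoch)
        scheduler = self._schedulers[idx]
        if idx > 0 and self._milestones[idx - 1] == self.last_epoch:
            # At a milestone boundary, restart the newly activated
            # scheduler (mirrors the upstream implementation).
            scheduler.step(0)
        else:
            # Proposed change: pass any extra arguments (e.g. a metric
            # for ReduceLROnPlateau) through to the active scheduler.
            scheduler.step(*args, **kwargs)
```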
Alternatives
Keep it as is and only allow schedulers that inherit from _LRScheduler. In that case, however, I propose making this class public; see #67760.
cc @vincentqb @jbschlosser @albanD