
Support for arbitrary schedulers in SequentialLR #68979

Open
marcm-ml opened this issue Nov 29, 2021 · 0 comments
Labels
feature A request for a proper, new feature. module: LrScheduler module: optimizer Related to torch.optim triaged This issue has been looked at by a team member, and triaged and prioritized into an appropriate module

Comments

marcm-ml commented Nov 29, 2021

🚀 Feature

SequentialLR should allow optional arguments in step(). This is a follow-up to #68978, which only requests support for ReduceLROnPlateau, whereas this issue requests broader support for arbitrary (custom) schedulers.

Motivation

See #68978 or Lightning-AI/pytorch-lightning#10759 for one case why this is useful.

Pitch

Add the ability to pass arguments to SequentialLR.step(), since the underlying scheduler, e.g. ReduceLROnPlateau or another custom scheduler, might need additional arguments. A sketch of the requested behaviour is below.
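
To make the pitch concrete, here is a minimal, self-contained sketch of the requested behaviour. SequentialWithArgs is a hypothetical stand-in, not the real SequentialLR (whose internals are private and version-dependent): its step() simply forwards any extra arguments to whichever wrapped scheduler is active, so a metric can reach ReduceLROnPlateau.step(metrics) after a warmup phase.

```python
from bisect import bisect_right

import torch
from torch.optim.lr_scheduler import LinearLR, ReduceLROnPlateau


class SequentialWithArgs:
    """Hypothetical stand-in for SequentialLR: step() forwards any extra
    arguments to the scheduler that is active for the current epoch."""

    def __init__(self, schedulers, milestones):
        self.schedulers = schedulers
        self.milestones = milestones
        self.last_epoch = 0

    def step(self, *args, **kwargs):
        self.last_epoch += 1
        # Index of the scheduler responsible for the current epoch,
        # mirroring how SequentialLR indexes into its milestones.
        idx = bisect_right(self.milestones, self.last_epoch)
        # Forward whatever the caller passed, e.g. a validation metric
        # for ReduceLROnPlateau.step(metrics).
        self.schedulers[idx].step(*args, **kwargs)


model = torch.nn.Linear(4, 1)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

warmup = LinearLR(optimizer, start_factor=0.1, total_iters=5)
plateau = ReduceLROnPlateau(optimizer)  # takes over at the milestone
scheduler = SequentialWithArgs([warmup, plateau], milestones=[5])

for epoch in range(10):
    optimizer.step()  # stand-in for a real training epoch
    val_loss = 1.0    # stand-in for a real validation loss
    if epoch < 4:
        scheduler.step()          # warmup active: no extra arguments
    else:
        scheduler.step(val_loss)  # plateau active: metric is forwarded
```

Note that the caller still has to pass arguments the active scheduler understands (blindly forwarding a metric to a plain _LRScheduler.step() would be misread as an epoch number), so whether SequentialLR should route, filter, or forward such arguments is part of the design question this issue raises.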

Alternatives

Keep as is and only allow schedulers that inherit from _LRScheduler. In that case, however, I propose making that class public; see #67760.

cc @vincentqb @jbschlosser @albanD

@marcm-ml marcm-ml changed the title Support for arbitrary scheduler in SequentialLR Support for arbitrary schedulers in SequentialLR Nov 29, 2021
@soulitzer soulitzer added feature A request for a proper, new feature. module: optimizer Related to torch.optim triaged This issue has been looked at by a team member, and triaged and prioritized into an appropriate module labels Nov 29, 2021