
Too large step_size in deepedit #1430

Open
KumoLiu opened this issue Jun 19, 2023 · 3 comments

Comments

@KumoLiu
Contributor

KumoLiu commented Jun 19, 2023

Describe the bug
The step_size is too large, which means the learning rate never decreases unless the model is trained for more than 5000 epochs.

lr_scheduler = torch.optim.lr_scheduler.StepLR(optimizer, step_size=5000, gamma=0.1)

Expected behavior
Set a smaller step_size, or set epoch_level=False in LrScheduleHandler so the scheduler steps every iteration rather than every epoch.
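For context, here is a minimal sketch of the decay rule that torch.optim.lr_scheduler.StepLR applies, lr = base_lr * gamma ** (epoch // step_size), written as a plain function (the function name and the 200-epoch run length are illustrative, not from the issue). It shows why step_size=5000 leaves the learning rate untouched in a typical epoch-level training run:

```python
def step_lr(base_lr: float, gamma: float, step_size: int, epoch: int) -> float:
    """Learning rate StepLR would yield at a given epoch:
    the base lr decayed by gamma once every step_size epochs."""
    return base_lr * gamma ** (epoch // step_size)

# With the issue's config (step_size=5000), after a realistic 200-epoch run
# the lr has never decayed:
print(step_lr(base_lr=0.1, gamma=0.1, step_size=5000, epoch=200))  # 0.1

# With a smaller step_size (e.g. 50), the decay fires four times in 200 epochs:
print(step_lr(base_lr=0.1, gamma=0.1, step_size=50, epoch=200))  # ≈ 1e-5
```

Stepping the scheduler per iteration instead of per epoch (epoch_level=False) has the same effect as shrinking step_size: the counter advances much faster, so the decay milestones are actually reached.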

Originally posted in Project-MONAI/MONAI#6625

@KumoLiu
Contributor Author

KumoLiu commented Jun 19, 2023

Hi @diazandr3s, could you please help take a look at this issue?
Thanks in advance!

@diazandr3s
Contributor

Hi @KumoLiu,

Thanks for the ping. Good point!
Have you tried reducing the step_size? You may also want to consider changing the learning rate and/or the optimizer.
As you may know, intensity normalization can also affect training.

@KumoLiu
Contributor Author

KumoLiu commented Jun 20, 2023

Hi @diazandr3s, I didn't try reducing the step_size. I'm assuming there's a typo here, since a value of 5000 is far too high for an epoch-level scheduler.
If it is a typo, I can submit a PR to fix it.

Thanks!
