
save initial arguments #4163

Merged
merged 4 commits into master from bugfix/4156_initial-hparams on Oct 15, 2020

Conversation

@Borda Borda (Member) commented Oct 15, 2020

What does this PR do?

The case is that some arguments can be DataModules which get a Trainer instance assigned at runtime; that leads to a saving error, but the instance is not needed for model reconstruction. So we save as hparams only the values from init time, namely from the moment save_hyperparameters is called.

Otherwise, save_hyperparameters should rather be renamed to register_hyperparameters_for_saving.

Prevents #4156
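A minimal sketch of the intended behavior (the module, argument names, and values below are illustrative, not taken from this PR):

```python
import pytorch_lightning as pl


class LitModel(pl.LightningModule):
    def __init__(self, lr=0.001, batch_size=32):
        super().__init__()
        # Snapshot the values the init arguments hold *right now*.
        # An argument such as a DataModule may later get a Trainer
        # attached at runtime, but that state is not part of the
        # snapshot, so saving a checkpoint cannot fail on it.
        self.save_hyperparameters()


model = LitModel(lr=0.01)
print(model.hparams.lr)          # 0.01 -- the init-time value
print(model.hparams.batch_size)  # 32  -- default captured as well
```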

PR review

Anyone in the community is free to review the PR once the tests have passed.
If we didn't discuss your PR in GitHub issues, there's a high chance it will not be merged.

Did you have fun?

Make sure you had fun coding 🙃

@Borda Borda added the bug (Something isn't working) and feature (Is an improvement or enhancement) labels Oct 15, 2020
@Borda Borda added this to the 1.0.x milestone Oct 15, 2020
@mergify mergify bot requested a review from a team October 15, 2020 06:54
@codecov codecov bot commented Oct 15, 2020

Codecov Report

Merging #4163 into master will increase coverage by 0%.
The diff coverage is 100%.

@@          Coverage Diff           @@
##           master   #4163   +/-   ##
======================================
  Coverage      93%     93%           
======================================
  Files         103     103           
  Lines        7798    7805    +7     
======================================
+ Hits         7232    7239    +7     
  Misses        566     566           

@williamFalcon williamFalcon merged commit f064682 into master Oct 15, 2020
@Borda Borda deleted the bugfix/4156_initial-hparams branch October 15, 2020 12:39
@Borda Borda mentioned this pull request Oct 15, 2020
@awaelchli awaelchli (Member) commented

If we run the tuner and it sets a new learning rate, which one will be loaded when restoring the checkpoint?

@Borda Borda (Member, Author) commented Oct 15, 2020

If we run the tuner and it sets a new learning rate, which one will be loaded when restoring the checkpoint?

Well, LR is part of the Trainer, not the Model, right?
Also, this is for logging, not checkpointing...
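
To illustrate that point (a sketch reusing the hypothetical LitModel from above; assigning model.lr by hand stands in for a tuner updating it at runtime):

```python
model = LitModel(lr=0.001)
model.lr = 0.1  # runtime change, e.g. what an LR finder might do

# The hparams snapshot was taken inside __init__, so it still
# holds the init-time value; the runtime change is separate.
print(model.hparams.lr)  # 0.001
print(model.lr)          # 0.1
```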
