🐛 Bug Report

When training with Catalyst and a scheduler, a warning is shown.

OS: Windows
Python 3.7.8
catalyst==20.10.1
torch==1.6.0
How To Reproduce

```python
import torch
from torch import nn
import torch.nn.functional as F
from torch.utils.data import TensorDataset, DataLoader
from catalyst import dl

X = torch.randn(1000, 10)
y = torch.rand(X.shape[0])

model = nn.Linear(X.shape[1], 1)
optimizer = torch.optim.Adam(model.parameters())
scheduler = torch.optim.lr_scheduler.ExponentialLR(optimizer, gamma=0.9)

dataset = TensorDataset(X, y)
loaders = {
    'train': DataLoader(dataset, batch_size=32, shuffle=True),
    'valid': DataLoader(dataset, batch_size=32),
}

class CustomRunner(dl.SupervisedRunner):
    def _handle_batch(self, batch):
        # model train/valid step
        y_pred = self.model(batch['features']).view(-1)
        loss = F.mse_loss(y_pred, batch['targets'])
        self.batch_metrics.update({"loss": loss})

runner = CustomRunner()
runner.train(
    model=model,
    loaders=loaders,
    optimizer=optimizer,
    scheduler=scheduler,
    num_epochs=20,
)
```
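A minimal sketch of how the exact warning text could be captured for a report like this, using Python's standard `warnings.catch_warnings`; `train_step` below is a hypothetical stand-in for `runner.train(...)`, not part of the original report:

```python
import warnings

def train_step():
    # Hypothetical stand-in for runner.train(...); emits a sample warning.
    warnings.warn("scheduler warning", UserWarning)

# Record every warning raised during the call so it can be quoted verbatim.
with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")  # ensure no warning is filtered out
    train_step()

for w in caught:
    print(f"{w.category.__name__}: {w.message}")
```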
Expected behavior

No warnings
PS
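For context, one common source of scheduler-related warnings since torch 1.1 is calling `scheduler.step()` before `optimizer.step()`; whether that is the warning triggered here is an assumption. A plain-PyTorch sketch of the expected ordering, independent of Catalyst:

```python
import torch
from torch import nn

model = nn.Linear(10, 1)
optimizer = torch.optim.Adam(model.parameters())  # default lr=1e-3
scheduler = torch.optim.lr_scheduler.ExponentialLR(optimizer, gamma=0.9)

for epoch in range(3):
    for _ in range(5):  # batches
        optimizer.zero_grad()
        loss = model(torch.randn(32, 10)).pow(2).mean()
        loss.backward()
        optimizer.step()  # step the optimizer on every batch first...
    scheduler.step()      # ...then the scheduler, once per epoch
```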
#980 may fix this problem
Should be fixed now