🐛 Bug

To Reproduce

If you want to pass the dataloaders as arguments during fitting (so training_step and validation_step are defined on the model, but train_dataloader and val_dataloader are not), running the learning rate finder raises the following error: pytorch_lightning.utilities.exceptions.MisconfigurationException: You have defined 'validation_step()', but have not passed in a val_dataloader().
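For illustration (using the names defined in the code sample below), the intended pattern is roughly:

trainer.fit(model, train_loader, val_loader)      # passing the loaders here works
lr_finder = trainer.lr_find(model, train_loader)  # but this raises the exception above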
Code sample
import os
import torch
from torch.nn import functional as F
from torch.utils.data import DataLoader
from torchvision.datasets import MNIST
from torchvision import transforms
import pytorch_lightning as pl
from pytorch_lightning import Trainer
class LitModel(pl.LightningModule):
    def __init__(self):
        super().__init__()
        self.l1 = torch.nn.Linear(28 * 28, 10)

    def forward(self, x):
        # Flatten the 28x28 image and apply a single linear layer
        return torch.relu(self.l1(x.view(x.size(0), -1)))

    def training_step(self, batch, batch_idx):
        x, y = batch
        y_hat = self(x)
        loss = F.cross_entropy(y_hat, y)
        tensorboard_logs = {'train_loss': loss}
        return {'loss': loss, 'log': tensorboard_logs}

    def validation_step(self, batch, batch_idx):
        x, y = batch
        y_hat = self(x)
        return {'val_loss': F.cross_entropy(y_hat, y)}

    def validation_epoch_end(self, outputs):
        # Average the per-batch validation losses
        avg_loss = torch.stack([x['val_loss'] for x in outputs]).mean()
        tensorboard_logs = {'val_loss': avg_loss}
        return {'val_loss': avg_loss, 'log': tensorboard_logs}

    def configure_optimizers(self):
        return torch.optim.Adam(self.parameters(), lr=0.001)
train_dataset = MNIST(os.getcwd(), train=True, download=True, transform=transforms.ToTensor())
train_loader = DataLoader(train_dataset, batch_size=32, num_workers=4, shuffle=True)
val_dataset = MNIST(os.getcwd(), train=False, download=True, transform=transforms.ToTensor())
val_loader = DataLoader(val_dataset, batch_size=32, num_workers=4, shuffle=False)  # no need to shuffle validation data
model = LitModel()
trainer = Trainer(gpus=1)
lr_finder = trainer.lr_find(model, train_loader)  # raises the MisconfigurationException above
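For reference, lr_find returns an LR finder object rather than a learning rate itself; on a successful run the typical follow-up (per the Lightning docs for this API; treat as a sketch) would be:

suggested_lr = lr_finder.suggestion()  # lr at the steepest descent of the loss curve
print(suggested_lr)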
Expected behavior

Simply determines the best learning rate.

Environment

The quick fix is to make the validation set a part of your model, i.e., define the val_dataloader method on the LightningModule. However, you are probably right that it should be possible to run the lr_find method without this workaround.
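A minimal sketch of that workaround, reusing LitModel, train_loader, and val_loader from the code sample above (the subclass name is only for illustration):

class LitModelWithVal(LitModel):
    # Attaching the validation loader to the model satisfies the
    # val_dataloader check that lr_find performs.
    def val_dataloader(self):
        return val_loader

model = LitModelWithVal()
trainer = Trainer(gpus=1)
lr_finder = trainer.lr_find(model, train_loader)  # no longer raises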