init_optimizers(self, model) #5977
clmartinez151 started this conversation in General
Replies: 2 comments
-
I guess it's not a valid config. You can pass it like:

opt1 = ...
opt2 = ...
sched1 = {'scheduler': ..., 'monitor': ...}
sched2 = {'scheduler': ..., 'monitor': ...}
return (
    {'optimizer': opt1, 'lr_scheduler': sched1},
    {'optimizer': opt2, 'lr_scheduler': sched2},
)
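For concreteness, a minimal sketch of what a filled-in configure_optimizers in this form could look like, assuming two Adam optimizers over hypothetical self.encoder / self.decoder parameter groups, ReduceLROnPlateau schedulers, and a logged metric named "val_loss" (all illustrative assumptions, not taken from the reply):

# Illustrative sketch: the parameter groups and the "val_loss" metric are assumptions.
def configure_optimizers(self):
    opt1 = torch.optim.Adam(self.encoder.parameters(), lr=1e-3)
    opt2 = torch.optim.Adam(self.decoder.parameters(), lr=1e-3)
    # Each scheduler dict carries its own monitor key.
    sched1 = {"scheduler": torch.optim.lr_scheduler.ReduceLROnPlateau(opt1), "monitor": "val_loss"}
    sched2 = {"scheduler": torch.optim.lr_scheduler.ReduceLROnPlateau(opt2), "monitor": "val_loss"}
    return (
        {"optimizer": opt1, "lr_scheduler": sched1},
        {"optimizer": opt2, "lr_scheduler": sched2},
    )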
-
Dear @clmartinez151, I think @rohitgr7 answered your question, closing this issue for now. Best,
-
🐛 Bug
init_optimizers(self, model) in pytorch_lightning/trainer/optimizers.py fails to pick up the monitor value when configure_optimizers() returns a tuple of multiple optimizers in dictionary form, each with its own LR scheduler and monitor value. The bug seems to occur in the elif clause at line 56, where no attempt is made to extract a monitor value, so the monitor is always None in this case.

Please reproduce using SomeModel class
Example class below:
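The example class referenced above did not survive into this page. Below is a hypothetical minimal sketch of a LightningModule whose configure_optimizers returns the tuple-of-dicts format described in the report; the class name SomeModel, the layer sizes, and the "val_loss" metric are illustrative assumptions, not the original code.

# Hypothetical reproduction sketch, not the reporter's original class.
import torch
import pytorch_lightning as pl
from torch.utils.data import DataLoader, TensorDataset


class SomeModel(pl.LightningModule):
    def __init__(self):
        super().__init__()
        self.encoder = torch.nn.Linear(32, 16)
        self.decoder = torch.nn.Linear(16, 32)

    def forward(self, x):
        return self.decoder(self.encoder(x))

    def training_step(self, batch, batch_idx, optimizer_idx=0):
        (x,) = batch
        return torch.nn.functional.mse_loss(self(x), x)

    def validation_step(self, batch, batch_idx):
        (x,) = batch
        loss = torch.nn.functional.mse_loss(self(x), x)
        self.log("val_loss", loss)  # metric name is an illustrative assumption

    def train_dataloader(self):
        return DataLoader(TensorDataset(torch.randn(64, 32)), batch_size=8)

    def val_dataloader(self):
        return DataLoader(TensorDataset(torch.randn(64, 32)), batch_size=8)

    def configure_optimizers(self):
        opt1 = torch.optim.Adam(self.encoder.parameters(), lr=1e-3)
        opt2 = torch.optim.Adam(self.decoder.parameters(), lr=1e-3)
        sched1 = {"scheduler": torch.optim.lr_scheduler.ReduceLROnPlateau(opt1), "monitor": "val_loss"}
        sched2 = {"scheduler": torch.optim.lr_scheduler.ReduceLROnPlateau(opt2), "monitor": "val_loss"}
        # Tuple of dicts, each with its own optimizer, scheduler, and monitor:
        # this is the return format the report says fails in init_optimizers.
        return (
            {"optimizer": opt1, "lr_scheduler": sched1},
            {"optimizer": opt2, "lr_scheduler": sched2},
        )


model = SomeModel()
trainer = pl.Trainer(max_epochs=1)
trainer.fit(model)  # raises MisconfigurationException on the affected version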
Expected behavior
You get an exception when the trainer attempts to load the model.
---------------------------------------------------------------------------
MisconfigurationException Traceback (most recent call last)
in
53 limit_test_batches=0.01)
54
---> 55 trainer.fit(model)
56 trainer.test(model)
57 model_metrics.append(trainer.progress_bar_metrics)
/net/10.57.1.2/vol/homes/martinezniev1/complexcode/env/lib64/python3.6/site-packages/pytorch_lightning/trainer/trainer.py in fit(self, model, train_dataloader, val_dataloaders, datamodule)
492 # ----------------------------
493 self.accelerator_backend = self.accelerator_connector.select_accelerator()
--> 494 self.accelerator_backend.setup(model)
495
496 # ----------------------------
/net/10.57.1.2/vol/homes/martinezniev1/complexcode/env/lib64/python3.6/site-packages/pytorch_lightning/accelerators/dp_accelerator.py in setup(self, model)
53 # CHOOSE OPTIMIZER
54 # allow for lr schedulers as well
---> 55 self.setup_optimizers(model)
56
57 # init torch data parallel
/net/10.57.1.2/vol/homes/martinezniev1/complexcode/env/lib64/python3.6/site-packages/pytorch_lightning/accelerators/accelerator.py in setup_optimizers(self, model)
148 return
149
--> 150 optimizers, lr_schedulers, optimizer_frequencies = self.trainer.init_optimizers(model)
151 self.trainer.optimizers = optimizers
152 self.trainer.lr_schedulers = lr_schedulers
/net/10.57.1.2/vol/homes/martinezniev1/complexcode/env/lib64/python3.6/site-packages/pytorch_lightning/trainer/optimizers.py in init_optimizers(self, model)
78 )
79
---> 80 lr_schedulers = self.configure_schedulers(lr_schedulers, monitor=monitor)
81 _validate_scheduler_optimizer(optimizers, lr_schedulers)
82
/net/10.57.1.2/vol/homes/martinezniev1/complexcode/env/lib64/python3.6/site-packages/pytorch_lightning/trainer/optimizers.py in configure_schedulers(self, schedulers, monitor)
130 if monitor is None:
131 raise MisconfigurationException(
--> 132 '`configure_optimizers` must include a monitor when a `ReduceLROnPlateau` scheduler is used.'
133 ' For example: {"optimizer": optimizer, "lr_scheduler": scheduler, "monitor": "metric_to_track"}'
134 )

MisconfigurationException: `configure_optimizers` must include a monitor when a `ReduceLROnPlateau` scheduler is used. For example: {"optimizer": optimizer, "lr_scheduler": scheduler, "monitor": "metric_to_track"}

Environment