Description
🐛 Bug
Issue when running with fast_dev_run=True:
"TypeError: log_hyperparams() takes 2 positional arguments but 3 were given"
To Reproduce
When using the following, where self.hp_metrics is a list of strings, each naming a metric that is being logged, e.g. "accuracy/val":
def on_train_start(self):
    if self.logger:
        self.logger.log_hyperparams(self.hparams, {metric: 0 for metric in self.hp_metrics})
Expected behavior
I assume the check is wrong, since the documentation says that self.logger.log_hyperparams takes one positional argument and one dictionary. The code runs fine without fast_dev_run=True, and everything is logged correctly to TensorBoard.
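For reference, a possible workaround sketch, assuming the error comes from fast_dev_run=True swapping the configured logger for a dummy logger whose log_hyperparams only accepts the hyperparameters argument (this is an assumption about the cause, not a confirmed fix): only pass the metrics dict when the logger is a TensorBoardLogger, and fall back to the single-argument call otherwise.

from pytorch_lightning.loggers import TensorBoardLogger

def on_train_start(self):
    if self.logger:
        if isinstance(self.logger, TensorBoardLogger):
            # TensorBoardLogger.log_hyperparams accepts an optional metrics dict
            self.logger.log_hyperparams(self.hparams, {metric: 0 for metric in self.hp_metrics})
        else:
            # other loggers (e.g. the dummy one used by fast_dev_run) may only accept the hyperparameters
            self.logger.log_hyperparams(self.hparams)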
Environment
pytorch_lightning 1.2.2