
Commit 9b64e39
remove the logging of val loss at test time (#8551)
* remove the logging of val loss at test time

Signed-off-by: HuiyingLi <willwin.lee@gmail.com>

* remove per dataloader test loss log

Signed-off-by: HuiyingLi <willwin.lee@gmail.com>

---------

Signed-off-by: HuiyingLi <willwin.lee@gmail.com>
HuiyingLi authored Mar 1, 2024
1 parent d8ebaa5 commit 9b64e39
Showing 1 changed file with 2 additions and 4 deletions.
@@ -488,11 +488,9 @@ def inference_epoch_end(self, outputs, mode, data_cfg):
             # we can only log on one rank if it is rank zero so we broadcast from last rank
             torch.distributed.broadcast(loss, get_last_rank())

-            self.log('val_loss', loss, prog_bar=True, rank_zero_only=True, batch_size=1)
+            if mode != 'test':
+                self.log('val_loss', loss, prog_bar=True, rank_zero_only=True, batch_size=1)

-            # Determine the key used to log the loss based on the user provided name of the dataset or the dataloader index.
-            loss_log_key = self._determine_log_key(data_cfg, dataloader_idx, "loss", mode)
-            self.log(loss_log_key, loss, batch_size=1)
             averaged_loss.append(loss)

             # Gather the outputs object from all data parallel ranks since we are using the DistributedSampler which splits data across DDP ranks.
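For context, here is a minimal sketch of the logging pattern this commit applies. The class and method names (TinySFTModule, _inference_epoch_end) are hypothetical and not NeMo's actual code; only the `if mode != 'test'` guard around the 'val_loss' log call mirrors the diff above.

    # Minimal sketch, assuming a PyTorch Lightning-style module; names are illustrative.
    import torch
    from pytorch_lightning import LightningModule

    class TinySFTModule(LightningModule):
        def _inference_epoch_end(self, batch_losses, mode):
            # Aggregate per-batch losses collected during validation or testing.
            loss = torch.stack(batch_losses).mean()

            # Report 'val_loss' only during validation; at test time the metric is
            # skipped so test runs no longer emit a misleading validation loss.
            if mode != 'test':
                self.log('val_loss', loss, prog_bar=True, rank_zero_only=True, batch_size=1)
            return loss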
