DummyLogger can be called with unknown methods #13224

Merged
3 changes: 3 additions & 0 deletions CHANGELOG.md
@@ -74,6 +74,9 @@ The format is based on [Keep a Changelog](http://keepachangelog.com/en/1.0.0/).
-



- Added support for calling unknown methods with `DummyLogger` ([#13224](https://github.com/PyTorchLightning/pytorch-lightning/pull/13224))

### Changed

- Enable validation during overfitting ([#12527](https://github.com/PyTorchLightning/pytorch-lightning/pull/12527))
8 changes: 8 additions & 0 deletions src/pytorch_lightning/loggers/logger.py
@@ -346,6 +346,14 @@ def __iter__(self):
        # if DummyLogger is substituting a logger collection, pretend it is empty
        yield from ()

    def __getattr__(self, name: str) -> Callable:
        """Allows the DummyLogger to be called with arbitrary methods, to avoid AttributeErrors."""

        def method(*args, **kwargs):
            return None

        return method


def merge_dicts(
    dicts: Sequence[Mapping],
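
For readers skimming the diff: a minimal sketch (not part of the patch) of what the new `__getattr__` enables, assuming the import path `pytorch_lightning.loggers.logger` from the file header above.

```python
from pytorch_lightning.loggers.logger import DummyLogger

logger = DummyLogger()

# Any unknown attribute now resolves to a no-op callable instead of raising AttributeError.
assert callable(logger.log_text)  # e.g. a method only WandbLogger actually implements

# Calling it silently swallows all positional and keyword arguments and returns None.
assert logger.log_text(key="samples", columns=["input"], data=[["x"]]) is None
```

This matters because the trainer substitutes a `DummyLogger` for the configured loggers under `fast_dev_run` (see the trainer change below), so user code that calls logger-specific methods keeps running.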
5 changes: 4 additions & 1 deletion src/pytorch_lightning/trainer/trainer.py
@@ -589,7 +589,10 @@ def _init_debugging_flags(
            self.check_val_every_n_epoch = 1
            self.loggers = [DummyLogger()] if self.loggers else []

            rank_zero_info(f"Running in fast_dev_run mode: will run the requested loop using {num_batches} batch(es).")
            rank_zero_info(
                f"Running in `fast_dev_run` mode: will run the requested loop using {num_batches} batch(es). "
                "Logging and checkpointing is suppressed."
            )

        self.limit_train_batches = _determine_batch_limits(limit_train_batches, "limit_train_batches")
        self.limit_val_batches = _determine_batch_limits(limit_val_batches, "limit_val_batches")
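
A hedged usage sketch of the behaviour the updated message describes, assuming the substitution shown in the hunk above happens during `Trainer.__init__`; `CSVLogger` stands in for whatever logger the user configured.

```python
import pytorch_lightning as pl
from pytorch_lightning.loggers import CSVLogger
from pytorch_lightning.loggers.logger import DummyLogger

# fast_dev_run runs the requested loops with a single batch and suppresses
# logging and checkpointing, as the new info message states.
trainer = pl.Trainer(fast_dev_run=True, logger=CSVLogger("logs"))

# The configured CSVLogger has been replaced by a DummyLogger for this run,
# so calls like trainer.logger.log_text(...) become harmless no-ops.
assert isinstance(trainer.loggers[0], DummyLogger)
```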
8 changes: 8 additions & 0 deletions tests/tests_pytorch/loggers/test_logger.py
@@ -257,6 +257,14 @@ def test_dummylogger_noop_method_calls():
    logger.log_metrics("1", 2, three="three")


def test_dummylogger_arbitrary_method_calls():
    """Test that the DummyLogger can be called with non-existent methods."""
    logger = DummyLogger()
    # Example method from WandbLogger
    assert hasattr(logger, "log_text")
    assert callable(logger.log_text)


def test_dummyexperiment_support_item_assignment():
"""Test that the DummyExperiment supports item assignment."""
    experiment = DummyExperiment()
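
The new test only checks attribute resolution; a hypothetical follow-up assertion (sketched below, not part of this PR, with an illustrative test name) would also verify that the call itself is a silent no-op:

```python
from pytorch_lightning.loggers.logger import DummyLogger


def test_dummylogger_arbitrary_method_call_is_noop():
    """Sketch: calling a non-existent method on DummyLogger returns None."""
    logger = DummyLogger()
    assert logger.log_text(key="table", columns=["a"], data=[[1]]) is None
    assert logger.anything_else(1, 2, three=3) is None
```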