feat: help smooth out losses' value on resume #473

Merged · 2 commits · Jun 19, 2024

22 changes: 22 additions & 0 deletions everyvoice/base_cli/callback.py
@@ -0,0 +1,22 @@
+from typing import Any, Dict
+
+import pytorch_lightning as pl
+from pytorch_lightning.callbacks import Callback
+from typing_extensions import override
+
+
+class ResetValidationDataloaderCallback(Callback):
+    """
+    Reset the validation progress to allow resuming and validating a full
+    validation set and not just the first example in the validation set.
+    """
+
+    @override
+    def on_save_checkpoint(
+        self,
+        trainer: "pl.Trainer",
+        pl_module: "pl.LightningModule",
+        checkpoint: Dict[str, Any],
+    ) -> None:
+        batch_progress = trainer.fit_loop.epoch_loop.val_loop.batch_progress
+        batch_progress.reset()
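
For context: PyTorch Lightning serializes the fit loop's progress counters into every checkpoint, and on resume the validation loop consults them to decide which batches still need to run. When the stored counters say a validation pass already finished, the resumed run re-validates only the first example, which produces the skewed validation loss this PR smooths out. A minimal inspection sketch of that state, assuming the PyTorch Lightning 2.x checkpoint layout with dotted loop keys (the checkpoint path is a placeholder):

import torch

# Placeholder path: any checkpoint produced during fit() will do.
ckpt = torch.load("path/to/last.ckpt", map_location="cpu")

# Assumption: PL 2.x nests loop state under "loops" -> "fit_loop" and
# flattens child loops into dotted keys.
progress = ckpt["loops"]["fit_loop"]["epoch_loop.val_loop.batch_progress"]
print(progress)  # the counters this PR's callback resets at checkpoint time
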
10 changes: 9 additions & 1 deletion everyvoice/base_cli/helpers.py
@@ -168,6 +168,8 @@
     gradient_clip_val: float | None,
     model_kwargs={},
 ):
+    from everyvoice.base_cli.callback import ResetValidationDataloaderCallback
+
     config = load_config_base_command(model_config, config_args, config_file)

     save_configuration_to_log_dir(config)
@@ -197,6 +199,7 @@
             **{"sub_dir": config.training.logger.sub_dir},
         }
     )
+
     lr_monitor = LearningRateMonitor(logging_interval="step")
     logger.info("Starting training.")
     # This callback will always save the last checkpoint
@@ -226,7 +229,12 @@
         max_steps=config.training.max_steps,
         check_val_every_n_epoch=config.training.check_val_every_n_epoch,
         val_check_interval=config.training.val_check_interval,
-        callbacks=[monitored_ckpt_callback, last_ckpt_callback, lr_monitor],
+        callbacks=[
+            monitored_ckpt_callback,
+            last_ckpt_callback,
+            lr_monitor,
+            ResetValidationDataloaderCallback(),
+        ],
         strategy=strategy,
         num_nodes=nodes,
         detect_anomaly=False,  # used for debugging, but triples training time
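
Because train_base_command constructs the Trainer for every EveryVoice model, all training runs now pick the callback up automatically; the function-local import is presumably meant to keep module import time down (an assumption, the PR does not say). Below is a self-contained toy sketch of the resume behaviour the callback enables; ToyModule, the dataloaders, and the checkpoint path are illustrative stand-ins, not part of this PR:

import pytorch_lightning as pl
import torch
from torch.utils.data import DataLoader, TensorDataset

from everyvoice.base_cli.callback import ResetValidationDataloaderCallback


class ToyModule(pl.LightningModule):
    """Minimal stand-in model, for illustration only."""

    def __init__(self):
        super().__init__()
        self.layer = torch.nn.Linear(4, 1)

    def training_step(self, batch, batch_idx):
        return self.layer(batch[0]).mean()

    def validation_step(self, batch, batch_idx):
        self.log("val_loss", self.layer(batch[0]).mean())

    def configure_optimizers(self):
        return torch.optim.SGD(self.parameters(), lr=0.1)


data = TensorDataset(torch.randn(64, 4))
train_dl = DataLoader(data, batch_size=8)
val_dl = DataLoader(data, batch_size=8)

trainer = pl.Trainer(
    max_steps=16,
    val_check_interval=8,
    callbacks=[ResetValidationDataloaderCallback()],
)
trainer.fit(ToyModule(), train_dl, val_dl)

# A resumed run now validates every batch, so its first reported val_loss is
# computed over the whole validation set (checkpoint path is a placeholder):
# trainer.fit(ToyModule(), train_dl, val_dl, ckpt_path="path/to/last.ckpt")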