
fix lightning hparams conflict #205

Open
wants to merge 6 commits into main

Conversation

AntonioMirarchi
Contributor

DO NOT MERGE YET!

In PyTorch Lightning, the LightningModule and the LightningDataModule are two separate components with distinct responsibilities. Occasionally, their hyperparameters can end up in conflict, causing issues during training or inference.
In the case I'm reporting, the error appears when the test is about to run: best_model.ckpt was saved before I resumed training with load_model, so the keys no longer match. One way to fix it is to update data.hparams from model.hparams (the change proposed by this PR).

[image: screenshot of the reported error]
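For context, a minimal self-contained sketch (toy classes, not the torchmd-net code) of how this kind of collision is triggered: both modules call save_hyperparameters() on a load_model argument, the restored model carries the value saved in the checkpoint, the DataModule carries the value from the resumed run's config, and Lightning refuses to merge the two hparams sets when the logger writes them at test time:

import pytorch_lightning as pl
import torch
from pytorch_lightning.loggers import CSVLogger
from torch.utils.data import DataLoader, TensorDataset

class ToyModel(pl.LightningModule):
    def __init__(self, load_model=None):
        super().__init__()
        self.save_hyperparameters()        # load_model ends up in self.hparams
        self.layer = torch.nn.Linear(1, 1)

    def test_step(self, batch, batch_idx):
        (x,) = batch
        return self.layer(x).sum()

class ToyData(pl.LightningDataModule):
    def __init__(self, load_model=None):
        super().__init__()
        self.save_hyperparameters()        # the same key, saved independently

    def test_dataloader(self):
        return DataLoader(TensorDataset(torch.randn(4, 1)))

model = ToyModel(load_model=None)                 # value stored in best_model.ckpt
data = ToyData(load_model="resumed_from.ckpt")    # value from the resumed run's input.yaml

trainer = pl.Trainer(logger=CSVLogger("logs"))
trainer.test(model, datamodule=data)
# -> RuntimeError: the 'load_model' key is present in both the LightningModule's and
#    the LightningDataModule's hparams but with different values.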

It's also true that large differences in hparams between the data and the model should probably not be allowed, only a few keys like load_model. If that's the case, maybe the best solution is to write a function like:

from pytorch_lightning.utilities import rank_zero_warn

def fix_lightning_conflict(model, data):
    """Inspect the hyperparameters of the model and the data module and fix any
    conflicts by providing a consistent set of hyperparameters for both components.
    Only keys listed in allowed_conflicts (e.g. load_model) are reconciled, and for
    those the model's value takes precedence over the data module's."""

    allowed_conflicts = ["load_model"]
    model_hparams = model.hparams
    data_hparams = data.hparams

    for param in allowed_conflicts:
        # Only act when the model defines the key and the data module disagrees.
        if param in model_hparams and data_hparams.get(param) != model_hparams[param]:
            rank_zero_warn(
                f"Conflict between model and dataset hparams for {param}! "
                f"Using {model_hparams[param]} from the model."
            )
            data_hparams[param] = model_hparams[param]

    return data_hparams
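A minimal usage sketch (assuming model is the LNNP module restored for testing and data is the corresponding DataModule). Since hparams behaves like a dict, the allowed keys are updated in place:

# Hypothetical usage: reconcile the whitelisted keys before the test stage runs.
fix_lightning_conflict(model, data)
trainer.test(model, datamodule=data)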

@AntonioMirarchi
Contributor Author

I'm not sure whether there's a more efficient way to implement this; if you have one, please propose it. Moreover, if you think that addressing this error, which only arises occasionally, is not worth the effort, we can consider closing this pull request (PR).

@RaulPPelaez
Collaborator

RaulPPelaez commented Jul 19, 2023

I am of the opinion that if the only difference in hparams is load_model (why is that an "hparam" anyway?) then it's ok.
Maybe you can simply do:
data.hparams.load_model = model.hparams.load_model
Or similarly for a list of allowed discrepancies.

@AntonioMirarchi
Contributor Author

AntonioMirarchi commented Jul 19, 2023

It works like a dict, so you can do: data.hparams['load_model'] = model.hparams['load_model']
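The same dict-style access also covers the suggestion of a list of allowed discrepancies; a short sketch (the list contents are just an example):

# Copy only the whitelisted keys from the model's hparams to the data module's.
allowed_conflicts = ["load_model"]
for key in allowed_conflicts:
    if key in model.hparams and data.hparams.get(key) != model.hparams[key]:
        data.hparams[key] = model.hparams[key]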

@AntonioMirarchi
Contributor Author

It's possible to use the hparams_file argument of LNNP.load_from_checkpoint() to directly specify the path to the YAML file containing the hyperparameters used for training. This ensures that the model used for testing has the same hyperparameters as the DataModule. However, note that this approach updates all hyperparameters without distinguishing between hyperparameters and other parameters. Alternatively, the save_hyperparameters call in the DataModule could be modified to store only specific parameters rather than all the arguments from the input YAML file.
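For illustration, a sketch of both options (file names and saved keys are placeholders, not taken from the repo):

# Assuming the usual torchmd-net import path.
from torchmdnet.module import LNNP

# Option 1: hand load_from_checkpoint the training YAML so the test-time model
# is rebuilt with the same hyperparameters the DataModule was configured with.
model = LNNP.load_from_checkpoint("best_model.ckpt", hparams_file="input.yaml")

# Option 2: inside the DataModule's __init__, save only selected arguments
# instead of everything coming from the input YAML, e.g.:
#     self.save_hyperparameters("batch_size", "num_workers")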

I'm currently training with the version from this PR.

@RaulPPelaez
Collaborator

RaulPPelaez left a comment


I do not see any problems with this approach. As far as I understand, the input.yaml file is guaranteed to exist and to contain the desired parameters.
cc @raimis @stefdoerr Do you see anything dangerous here?

@stefdoerr
Collaborator

I feel like it's safer to just update those two parameters than to load the whole config in.

@AntonioMirarchi
Contributor Author

The dtype arg is not a problem anymore; we fixed it in #208. The original idea was to update just some parameters, essentially the ones involved in the mismatch error. The problem is that the collision check on hparams between the LightningModule and the DataModule (check this for more) works with hparams_initial, and I was not able to modify that dict in any way (AFAIK it works differently from hparams).
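For reference, a small toy sketch (not torchmd-net code; behaviour as in recent pytorch_lightning releases) of the difference: hparams_initial is a snapshot taken when save_hyperparameters runs, so later edits to hparams are never reflected in it:

import pytorch_lightning as pl

class Toy(pl.LightningModule):
    def __init__(self, load_model=None):
        super().__init__()
        self.save_hyperparameters()   # hparams_initial is frozen at this point

m = Toy()
m.hparams["load_model"] = "best_model.ckpt"   # mutating hparams works...
print(m.hparams["load_model"])                # 'best_model.ckpt'
print(m.hparams_initial["load_model"])        # None: the snapshot is untouched,
                                              # and the collision check reads this one.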
