Loading a pre-trained model with Weight Drop #99

Closed

dhananjaisharma10 opened this issue May 8, 2019 · 1 comment

dhananjaisharma10 commented May 8, 2019

Hi!

I trained a sequence-to-sequence model using Weight Drop and then tried to use it for inference, but I got the following error:

RuntimeError: Error(s) in loading state_dict for Seq2Seq:
        Unexpected key(s) in state_dict: "encoder.lstm2.module.weight_hh_l0", "encoder.lstm2.module.weight_hh_l0_reverse", "encoder.lstm3.module.weight_hh_l0", "encoder.lstm3.module.weight_hh_l0_reverse", "encoder.lstm4.module.weight_hh_l0", "encoder.lstm4.module.weight_hh_l0_reverse".

I presume this is because those keys are not part of my model by default. I am using model.load_state_dict(torch.load('PATH'), strict=False) to load the weights, but I'm unsure whether the strict=False part is correct.

Please let me know if I'm erring somewhere, or if you need any further information.
Thanks!
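
A minimal sketch of one way to handle this (not the exact hack referenced from Issue #86 below), assuming `model` is the already-constructed Seq2Seq instance and 'PATH' is the saved checkpoint: keep only the checkpoint entries the model actually declares, then load with strict=True so genuinely missing weights still raise an error instead of being silently skipped.

    import torch

    # Rough sketch, not the Issue #86 hack. The unexpected
    # "module.weight_hh_l0*" entries appear to be the copies WeightDrop
    # re-registers on the wrapped LSTM during forward, saved alongside the
    # "*_raw" weights. Drop anything the inference-time model does not
    # declare, then load strictly. 'PATH' and `model` are placeholders for
    # the checkpoint file and the Seq2Seq instance.
    checkpoint = torch.load('PATH', map_location='cpu')
    model_keys = set(model.state_dict().keys())

    filtered = {k: v for k, v in checkpoint.items() if k in model_keys}

    print('Ignoring unexpected keys:', sorted(set(checkpoint) - model_keys))
    model.load_state_dict(filtered, strict=True)

strict=False would also load here, but it additionally ignores keys missing from the checkpoint, so the explicit filter keeps real mismatches visible.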

dhananjaisharma10 (Author)

Solved! Refer to Issue #86 and use the hack-code posted by @sdraper-CS.
