Errors occur in LrUpdaterHook when multiple optimizers are introduced #887
Comments
Please show the optimizer that you define.
Thanks for your prompt reply, your guess is correct. BTW, self.regular_lr is a dict in my case.
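The optimizer definition itself was not preserved in this copy of the thread. For context, the situation that triggers the bug is passing the runner a dict of optimizers instead of a single one; a minimal sketch, with purely illustrative module and key names:

```python
import torch
import torch.nn as nn

# Two sub-networks trained with separate optimizers; the names
# 'backbone' and 'head' are illustrative only.
backbone = nn.Linear(16, 16)
head = nn.Linear(16, 2)

# mmcv's runner accepts a dict of optimizers. With a dict like this,
# LrUpdaterHook stores base_lr and regular_lr as dicts keyed by
# optimizer name rather than as flat lists.
optimizer = dict(
    backbone=torch.optim.Adam(backbone.parameters(), lr=1e-2),
    head=torch.optim.Adam(head.parameters(), lr=1e-3),
)
```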
Should the MomentumUpdaterHook also set different momentums for different optimizers, the way LrUpdaterHook handles different learning rates? Compare:

mmcv/mmcv/runner/hooks/momentum_updater.py, lines 44 to 48 at 371a217
mmcv/mmcv/runner/hooks/lr_updater.py, lines 69 to 81 at 371a217
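The embedded snippets were not preserved in this copy. As a best-effort reconstruction (not an exact copy of commit 371a217), the two referenced methods looked roughly like this: get_regular_momentum assumes a single optimizer, while get_regular_lr already branches on dict to support multiple optimizers.

```python
# mmcv/mmcv/runner/hooks/momentum_updater.py, ~lines 44-48:
# iterates a flat list, i.e. assumes a single optimizer.
def get_regular_momentum(self, runner):
    return [
        self.get_momentum(runner, _base_momentum)
        for _base_momentum in self.base_momentum
    ]

# mmcv/mmcv/runner/hooks/lr_updater.py, ~lines 69-81:
# branches on dict, so it supports multiple optimizers.
def get_regular_lr(self, runner):
    if isinstance(runner.optimizer, dict):
        lr_groups = {}
        for k in runner.optimizer.keys():
            _lr_group = [
                self.get_lr(runner, _base_lr)
                for _base_lr in self.base_lr[k]
            ]
            lr_groups.update({k: _lr_group})
        return lr_groups
    else:
        return [self.get_lr(runner, _base_lr) for _base_lr in self.base_lr]
```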
Hello Zhou, to the best of my knowledge the MomentumUpdaterHook should set different momentums for different optimizers. (I'm currently using the Adam optimizer for my project, which incorporates momentum, and different Adam optimizers should surely have different momentums.) BTW, I simply modified the LrUpdaterHook; the patch itself was not preserved in this copy of the thread. So far it works well for me.
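Since the user's patch was lost, here is a minimal sketch of a fix in the same spirit, assuming the goal is to make get_warmup_lr branch on dict exactly as get_regular_lr does (this is also what the related PR #907 later implemented, but the code below is a reconstruction, not the user's actual diff):

```python
def get_warmup_lr(self, cur_iters):

    def _get_warmup_lr(cur_iters, regular_lr):
        # The existing warmup math, applied to one flat list of LRs.
        if self.warmup == 'constant':
            warmup_lr = [_lr * self.warmup_ratio for _lr in regular_lr]
        elif self.warmup == 'linear':
            k = (1 - cur_iters / self.warmup_iters) * (1 - self.warmup_ratio)
            warmup_lr = [_lr * (1 - k) for _lr in regular_lr]
        elif self.warmup == 'exp':
            k = self.warmup_ratio**(cur_iters / self.warmup_iters)
            warmup_lr = [_lr * k for _lr in regular_lr]
        return warmup_lr

    if isinstance(self.regular_lr, dict):
        # Multiple optimizers: regular_lr is a dict of lists, so
        # compute warmup LRs separately for each optimizer key.
        return {
            key: _get_warmup_lr(cur_iters, regular_lr)
            for key, regular_lr in self.regular_lr.items()
        }
    # Single optimizer: regular_lr is a flat list, as before.
    return _get_warmup_lr(cur_iters, self.regular_lr)
```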
Thanks for your feedback; we will support what you suggest in the next version.
Related PR: #907
Original issue

Hi,
I'm using mmpose with multiple optimizers and I got errors in LrUpdaterHook. It seems this happens because get_warmup_lr does not work with multiple optimizers: self.regular_lr is a list for a single optimizer but a dict for multiple optimizers.
mmcv/mmcv/runner/hooks/lr_updater.py, lines 83 to 92 at 73bff4e
mmcv/mmcv/runner/hooks/lr_updater.py, lines 69 to 81 at 73bff4e
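The embedded snippets were not preserved in this copy. The get_warmup_lr referenced above (lines 83 to 92) looked roughly like the following best-effort reconstruction; every branch iterates self.regular_lr as a flat list, which breaks when it is a dict, since iterating a dict yields its string keys:

```python
def get_warmup_lr(self, cur_iters):
    # Each branch assumes self.regular_lr is a flat list of floats.
    # With multiple optimizers it is a dict, so _lr is a string key
    # and e.g. `_lr * self.warmup_ratio` raises:
    # TypeError: can't multiply sequence by non-int of type 'float'
    if self.warmup == 'constant':
        warmup_lr = [_lr * self.warmup_ratio for _lr in self.regular_lr]
    elif self.warmup == 'linear':
        k = (1 - cur_iters / self.warmup_iters) * (1 - self.warmup_ratio)
        warmup_lr = [_lr * (1 - k) for _lr in self.regular_lr]
    elif self.warmup == 'exp':
        k = self.warmup_ratio**(cur_iters / self.warmup_iters)
        warmup_lr = [_lr * k for _lr in self.regular_lr]
    return warmup_lr
```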