Warning: Detected call of lr_scheduler.step() before optimizer.step() #1100
Comments
I successfully located the variable and fixed it with a simple conditional check.
I replied to your PR.
I also got the same error. I cloned my repo in Jan 2021, so I don't understand why it still shows up, since it should have been resolved by now. Has it been updated?
I'm also still getting this warning, and I noticed that at least in the first 25 training epochs the learning rate never updates (it stays at 0.0002000). I also noticed that the PR referenced above hasn't been merged yet.
Can you please describe where and how to call lr_scheduler.step() after optimizer.step()? Many thanks.
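In case it helps, here is a minimal sketch of the ordering PyTorch (1.1.0+) expects: `optimizer.step()` runs inside the batch loop, and `lr_scheduler.step()` runs once per epoch after it. This is not this repository's actual training loop; the model, data, and scheduler below are placeholders.

```python
import torch
from torch.utils.data import DataLoader, TensorDataset

# Dummy data so the sketch runs standalone.
dataset = TensorDataset(torch.randn(64, 10), torch.randint(0, 2, (64,)))
loader = DataLoader(dataset, batch_size=16)

model = torch.nn.Linear(10, 2)
optimizer = torch.optim.Adam(model.parameters(), lr=2e-4)
scheduler = torch.optim.lr_scheduler.StepLR(optimizer, step_size=10, gamma=0.5)
criterion = torch.nn.CrossEntropyLoss()

for epoch in range(30):
    for inputs, targets in loader:
        optimizer.zero_grad()
        loss = criterion(model(inputs), targets)
        loss.backward()
        optimizer.step()    # weight update comes first
    scheduler.step()        # learning-rate update comes after optimizer.step()
```

Calling `scheduler.step()` before the optimizer has ever stepped (for example, at the top of the first epoch) is what triggers the warning.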
I believe this warning is caused by a change in the required order of the learning-rate update. Since it seemed better to let you know, I opened an issue.
This can easily be fixed by adding one conditional check. However, before I try to find a solution for this issue, I want to confirm whether all training runs start at epoch 1. If not, I would like to know where the starting epoch is defined.
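For illustration, a hedged sketch of the kind of conditional guard described above, assuming a configurable `start_epoch`; the names and the dummy training step are placeholders, not this repository's code.

```python
import torch

model = torch.nn.Linear(10, 2)
optimizer = torch.optim.Adam(model.parameters(), lr=2e-4)
scheduler = torch.optim.lr_scheduler.LambdaLR(optimizer, lr_lambda=lambda e: 1.0)

start_epoch = 1    # assumption: resumed runs may start at a later epoch
n_epochs = 5

for epoch in range(start_epoch, n_epochs + 1):
    # Guard the scheduler call so it never runs before the optimizer has
    # stepped at least once, which is what raises the warning when the
    # learning-rate update sits at the top of the epoch loop.
    if epoch > start_epoch:
        scheduler.step()

    # Placeholder for one epoch of training.
    optimizer.zero_grad()
    loss = model(torch.randn(4, 10)).sum()
    loss.backward()
    optimizer.step()
```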