How do optimizer frequencies work? #3481
Comments
It's fixed on master, see #3229. (@rohitgr7)
@awaelchli I tried it and got this. Is it OK, or is it a bug?
@7rick03ligh7 When you have frequencies defined, each training_step uses only one optimizer, chosen based on the current step, and outputs from steps that used a different optimizer are not collected separately. So you end up with a single list whose size equals the number of training steps. Your output is expected, so it's not a bug.
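A minimal, dependency-free sketch of the scheduling behaviour described above. It mirrors how Lightning cycles through optimizers in proportion to their frequencies; the function name is illustrative, not Lightning API:

```python
def active_optimizer_idx(step, frequencies):
    """Return the index of the optimizer used at a given training step,
    cycling through the optimizers in proportion to their frequencies."""
    pos = step % sum(frequencies)  # position inside one full cycle
    for idx, freq in enumerate(frequencies):
        if pos < freq:
            return idx
        pos -= freq

# With frequencies [5, 1] (e.g. 5 critic steps per generator step),
# steps 0-4 use optimizer 0, step 5 uses optimizer 1, then it repeats:
print([active_optimizer_idx(s, [5, 1]) for s in range(12)])
# [0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 1]
```

Since exactly one optimizer is active per step, each step produces one output, which is why everything lands in a single flat list.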
@rohitgr7 Yes! A subtle difference, but very important. This is indeed working as intended.
@awaelchli Yeah, but the documentation doesn't make it clear that with frequencies the outputs end up in a single list (at least not to me).
Anyway, thanks for your replies!
❓ Questions and Help
What is your question?
I don't understand how optimizer frequencies work.
#1269
Code
When I tried to work with them as a list of optimizers, I got this error. It arises after training_step().
What's your environment?
P.S.
If you have a pytorch-lightning WGAN implementation, or anything using n_critics, I'd appreciate it if you could share!
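Not a full WGAN, but a hedged sketch of how `configure_optimizers` can request per-optimizer frequencies for an n_critic-style setup. The list-of-dicts return format with `"optimizer"` and `"frequency"` keys is Lightning's documented way to set frequencies; the dummy optimizer class below stands in for real `torch.optim` optimizers so the snippet is self-contained:

```python
class DummyOptimizer:
    """Stand-in for a torch.optim optimizer, just for illustration."""
    def __init__(self, name):
        self.name = name

def configure_optimizers():
    # In a real LightningModule this would be a method, and the optimizers
    # would wrap self.critic.parameters() / self.generator.parameters().
    n_critic = 5  # WGAN-style: train the critic 5x per generator step
    opt_c = DummyOptimizer("critic")
    opt_g = DummyOptimizer("generator")
    return [
        {"optimizer": opt_c, "frequency": n_critic},
        {"optimizer": opt_g, "frequency": 1},
    ]

cfg = configure_optimizers()
print([d["frequency"] for d in cfg])  # [5, 1]
```

With this setup, training_step receives only the currently active optimizer's index, so the critic and generator branches are selected by checking optimizer_idx rather than by stepping both optimizers manually.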