Question about learning rate #80
Comments
Hello @wwjwy, I'm also currently trying to run this code, but I have some problems, and your post is the most recent one I can find. I'd like to ask you about a few issues — if you don't mind, could you give me a way to reach you? Any social media or platform would be fine. Thank you. P.S.: sorry I could not provide any solution to your issues.
My email is wwjhhu@gmail.com
LR decay is all done in Keras: https://keras.io/api/optimizers/learning_rate_schedules/
In the paper, you said: "The learning rate is set initially to 10^-4 for both generator and critic. After the first 150 epochs we linearly decay the rate to zero over the next 150 epochs."
But I cannot find the relevant code for adjusting the learning rate. Can you help me?
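For reference, the schedule quoted from the paper (constant 10^-4 for the first 150 epochs, then linear decay to zero over the next 150) can be sketched as a plain Python function. This is not the repository's actual code — it's a minimal sketch, assuming the epoch counts from the paper; in Keras it could be attached via the `tf.keras.callbacks.LearningRateScheduler` callback.

```python
# Sketch of the paper's schedule (assumed values, not the repo's code):
# constant 1e-4 for 150 epochs, then linear decay to zero over 150 more.

INITIAL_LR = 1e-4       # initial rate for both generator and critic (from the paper)
CONSTANT_EPOCHS = 150   # epochs held at the initial rate
DECAY_EPOCHS = 150      # epochs over which the rate decays linearly to zero


def linear_decay_lr(epoch: int) -> float:
    """Return the learning rate for a given 0-indexed epoch."""
    if epoch < CONSTANT_EPOCHS:
        return INITIAL_LR
    # Linearly interpolate from INITIAL_LR down to 0 over DECAY_EPOCHS epochs.
    remaining = max(0, CONSTANT_EPOCHS + DECAY_EPOCHS - epoch)
    return INITIAL_LR * remaining / DECAY_EPOCHS


# In Keras this would be wired up roughly as:
# model.fit(..., callbacks=[tf.keras.callbacks.LearningRateScheduler(linear_decay_lr)])
```

Keras also ships built-in schedule objects (e.g. `PolynomialDecay` with `power=1.0` gives linear decay), but those step per batch rather than per epoch, so a per-epoch callback like the one above is the more direct translation of the paper's wording.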