Add end_scale argument #975
Conversation
Thanks @stefanocortinovis for the contribution! The code looks good to me, but I'm not so sure about the name. What would you think about …
Thanks for the quick feedback, @fabianp! I named it `end_scale` for consistency with `end_value` in `optax.exponential_decay`.
Ah, I see now. Thanks for the response, makes sense.
The reason I'm not convinced with `end_scale` is … Anyway, I also couldn't come up with a better name, so let's give this 24h and merge it if nobody comes up with a better suggestion.
Thanks @stefanocortinovis!
Thanks for the suggestion, @vroulet! Yeah, I was also wondering that. Personally, I have only seen `min_lr` used for the learning rate itself rather than for a scale. In principle, I agree that …
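For context on the comparison, `min_lr` on the PyTorch side is an absolute floor on the learning rate itself, not on a multiplicative scale (the scheduler's full path is `torch.optim.lr_scheduler.ReduceLROnPlateau`). A minimal sketch; the model and loop are purely illustrative:

```python
import torch

model = torch.nn.Linear(2, 1)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

# min_lr is an absolute floor: the learning rate itself never drops
# below 1e-4, no matter how many plateaus are detected.
scheduler = torch.optim.lr_scheduler.ReduceLROnPlateau(
    optimizer, factor=0.5, patience=5, min_lr=1e-4
)

for epoch in range(20):
    val_loss = 1.0  # placeholder metric; training step omitted
    scheduler.step(val_loss)  # the scheduler watches this metric for plateaus
```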
Then `min_scale`?
+1, I like this
Sounds good! I made the change from `end_scale` to `min_scale`.
This PR adds an `end_scale` argument to `contrib.reduce_on_plateau()`: similar to `min_lr` in `torch.optim.ReduceLROnPlateau`, it allows the user to specify a threshold for the scale at which to stop the learning rate decay. Both `factor < 1.0` (in which case `end_scale` is treated as a lower bound) and `factor > 1.0` (in which case `end_scale` is treated as an upper bound) are supported. The name mirrors `end_value` in `optax.exponential_decay`.
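For illustration, a minimal usage sketch of the argument chained after a base optimizer, written with the `min_scale` name the thread settles on; the toy model, loop, and the particular values shown are assumptions for the sketch, not quotes from the PR:

```python
import jax
import jax.numpy as jnp
import optax

# reduce_on_plateau multiplies updates by a scale that shrinks by `factor`
# at each detected plateau; with factor < 1.0, min_scale acts as a lower
# bound on that scale (with factor > 1.0 it would act as an upper bound).
opt = optax.chain(
    optax.adam(1e-2),
    optax.contrib.reduce_on_plateau(factor=0.5, patience=5, min_scale=0.01),
)

params = {"w": jnp.ones(3)}
state = opt.init(params)

def loss_fn(p):
    return jnp.sum(p["w"] ** 2)

for _ in range(100):
    loss, grads = jax.value_and_grad(loss_fn)(params)
    # reduce_on_plateau consumes the current loss via the `value` kwarg
    # to detect plateaus; optax.chain forwards it to the transform.
    updates, state = opt.update(grads, state, params, value=loss)
    params = optax.apply_updates(params, updates)
```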