
Allow setting betas parameters for LAMB optimizer #111

Open
jimypbr opened this issue Jun 22, 2022 · 0 comments
Labels
enhancement New feature or request

Comments


jimypbr commented Jun 22, 2022

[From AlexC in GC]
Currently the `betas` parameters can only be set for the AdamW optimizer. We would like to be able to set them for LAMB as well.

  • Rename the training arguments `adam_beta1` and `adam_beta2` to `optimizer_beta1` and `optimizer_beta2`
  • In `IPUTrainer.create_optimizer`, pass the values of these arguments to `betas` as part of `optimizer_kwargs`

Bonus point: while you are at it, you could also rename the parameter `adam_epsilon` to `optimizer_epsilon`, since this value is used by LAMB too.
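The change could look roughly like the following sketch. The argument names `optimizer_beta1`, `optimizer_beta2`, and `optimizer_epsilon` are the proposed renames, and `build_optimizer_kwargs` is a hypothetical helper for illustration; this is not the actual `IPUTrainer.create_optimizer` implementation.

```python
# Sketch of collecting optimizer-agnostic hyperparameters so that both
# AdamW and LAMB receive the same betas/eps values. Names are illustrative.
from dataclasses import dataclass


@dataclass
class TrainingArgs:
    # Proposed renames of adam_beta1 / adam_beta2 / adam_epsilon.
    optimizer_beta1: float = 0.9
    optimizer_beta2: float = 0.999
    optimizer_epsilon: float = 1e-8


def build_optimizer_kwargs(args: TrainingArgs) -> dict:
    """Build the keyword arguments shared by AdamW and LAMB.

    Both optimizers accept `betas` and `eps`, so create_optimizer could
    pass this dict as part of optimizer_kwargs regardless of which
    optimizer the user selected.
    """
    return {
        "betas": (args.optimizer_beta1, args.optimizer_beta2),
        "eps": args.optimizer_epsilon,
    }


kwargs = build_optimizer_kwargs(TrainingArgs(optimizer_beta1=0.8))
print(kwargs["betas"])  # -> (0.8, 0.999)
```

With a helper like this, switching between AdamW and LAMB would no longer silently drop the user's beta settings, since the kwargs are built once and handed to whichever optimizer class is constructed.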

@jimypbr jimypbr added the enhancement New feature or request label Jun 22, 2022