Optimizer wish list #9182
Optimizer plays a fundamental role in machine learning. This issue tracks the optimizers that people have requested for support in MXNet. Please comment below to recommend new optimizers. It's inspired by the thread at https://discuss.gluon.ai/t/topic/3714.

| Optimizer | Reference | Status |
| --- | --- | --- |
| FTML | Zheng & Kwok, Follow the Moving Leader in Deep Learning, ICML 2017 | in progress |
| AMSGrad | On the Convergence of Adam and Beyond, ICLR 2018 submission | |
| AdaBound | | |
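Of the requests above, AMSGrad is the simplest to state: it is Adam with a running elementwise maximum of the second-moment estimate in the denominator, which keeps the effective step size from ever growing. Below is a minimal NumPy sketch for illustration only; the function name and signature are ours, not MXNet API, and the first-moment bias correction is kept as most practical implementations do, even though the paper omits it.

```python
import numpy as np

def amsgrad_step(w, g, m, v, v_max, t, lr=1e-3,
                 beta1=0.9, beta2=0.999, eps=1e-8):
    # Standard Adam moment updates.
    m = beta1 * m + (1 - beta1) * g
    v = beta2 * v + (1 - beta2) * g * g
    # The AMSGrad change: never let the second-moment estimate used in the
    # denominator decrease, so the effective learning rate is non-increasing.
    v_max = np.maximum(v_max, v)
    # First-moment bias correction as in Adam (the paper drops this).
    m_hat = m / (1 - beta1 ** t)
    w = w - lr * m_hat / (np.sqrt(v_max) + eps)
    return w, m, v, v_max
```

Here `w`, `g`, `m`, `v`, and `v_max` are NumPy arrays of the parameter's shape (moments initialized to zeros), and `t` is the 1-based step count.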
Comments

@Hunter-Zolomon: @szha I would like to attempt an implementation of the RAdam optimizer. I have written some code here that probably isn't yet ready for a PR (it might not even be correct). How would you like me to share it with you for review, so that you can point me in the right direction? Thanks.
@szha: @Hunter-Zolomon thanks for offering to contribute. To implement an optimizer in MXNet, the best reference would be the existing optimizers; you can find many in the optimizer module. Once you have the implementation ready, you can refer to the contribution guides and submit a pull request. Make sure to also include tests for the new optimizer, following the examples here. Feel free to ping me or others for review on the pull request.
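For concreteness, here is a rough sketch of what an RAdam optimizer could look like against MXNet's Python `Optimizer` interface, following the pattern of the built-in Adam implementation. This is an illustration under assumptions, not reviewed code: the class name, hyperparameter defaults, and the `rho_t > 4` rectification threshold come from the RAdam paper (Liu et al., 2019), and the `create_state`/`update` methods are assumed to match the interface used by MXNet's existing optimizers (gradient clipping and multi-precision handling are omitted for brevity).

```python
import math

from mxnet import nd
from mxnet.optimizer import Optimizer, register


@register
class RAdam(Optimizer):
    """Rectified Adam (Liu et al., 2019) -- illustrative sketch only."""

    def __init__(self, beta1=0.9, beta2=0.999, epsilon=1e-8, **kwargs):
        super(RAdam, self).__init__(**kwargs)
        self.beta1 = beta1
        self.beta2 = beta2
        self.epsilon = epsilon

    def create_state(self, index, weight):
        # First and second moment estimates, as in Adam.
        return (nd.zeros(weight.shape, weight.context, dtype=weight.dtype),
                nd.zeros(weight.shape, weight.context, dtype=weight.dtype))

    def update(self, index, weight, grad, state):
        self._update_count(index)
        t = self._index_update_count[index]
        lr = self._get_lr(index)
        wd = self._get_wd(index)
        grad = grad * self.rescale_grad + wd * weight

        mean, var = state
        mean[:] = self.beta1 * mean + (1 - self.beta1) * grad
        var[:] = self.beta2 * var + (1 - self.beta2) * grad * grad

        m_hat = mean / (1 - self.beta1 ** t)
        rho_inf = 2.0 / (1 - self.beta2) - 1
        rho_t = rho_inf - 2.0 * t * self.beta2 ** t / (1 - self.beta2 ** t)

        if rho_t > 4:
            # Variance of the adaptive term is tractable: rectify the step.
            v_hat = nd.sqrt(var / (1 - self.beta2 ** t))
            r_t = math.sqrt((rho_t - 4) * (rho_t - 2) * rho_inf /
                            ((rho_inf - 4) * (rho_inf - 2) * rho_t))
            weight[:] = weight - lr * r_t * m_hat / (v_hat + self.epsilon)
        else:
            # Early iterations: fall back to SGD with bias-corrected momentum.
            weight[:] = weight - lr * m_hat
```

With default `beta2=0.999` the rectification term is disabled for roughly the first four steps, during which the update reduces to plain momentum SGD, which is exactly the warmup-like behavior the paper motivates.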
@szha @szhengac