Implement Rmsprop optimiser #8
Conversation
TorchOpt/_src/transform.py (outdated):
```python
del params
nu = _update_moment_per_elem_norm(updates, state.nu, decay, 2, inplace)
if inplace:
    def f(g, n): return g.mul_(torch.rsqrt_(n.add_(eps)))
```
This doesn't look right: `n.add_` will modify `n` in place.
Here we only want to modify `g`, not `n`.
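A minimal sketch of the aliasing problem, assuming representative values for `g`, `n`, and `eps` (the names follow the snippet above; the numbers are purely illustrative):

```python
import torch

eps = 1e-8
nu = torch.tensor([4.0])  # stored second-moment state
g = torch.tensor([2.0])   # gradient to be rescaled

# Buggy pattern: n.add_(eps) and torch.rsqrt_ both mutate the stored state.
n = nu
g.mul_(torch.rsqrt_(n.add_(eps)))
print(nu)  # ~tensor([0.5000]); nu now holds rsqrt(nu + eps), not the moment

# Intended pattern: only g is modified in place; nu is read, never written.
nu = torch.tensor([4.0])
g = torch.tensor([2.0])
g.mul_(torch.rsqrt(nu.add(eps)))
print(nu)  # tensor([4.]); the state is preserved for the next step
```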
Fixed this, and likewise fixed the other places that modified `n`. Also fixed a bug in a test script.
```python
def f(g, n): return g.mul(torch.rsqrt(n.add(eps)))
# """The followings are pytorch style"""
# if inplace:
#     def f(g, n): return g.div_(torch.sqrt_(n).add_(eps))
```
Suggest fixing this one as well; it seems it should be `torch.sqrt(n).add_`.
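For the commented-out PyTorch-style branch, a sketch of the suggested form with illustrative values: `torch.sqrt_(n)` would overwrite `n`, whereas `torch.sqrt(n)` allocates a temporary, so the trailing `.add_(eps)` only touches that temporary:

```python
import torch

eps = 1e-8
nu = torch.tensor([4.0])  # stored second-moment state
g = torch.tensor([2.0])   # gradient to be rescaled

# torch.sqrt(nu) returns a new tensor, so .add_(eps) mutates only the
# temporary and the stored state survives intact.
g.div_(torch.sqrt(nu).add_(eps))
print(nu)  # tensor([4.]); unchanged
```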
Line 358?
Yes.
Should it be RMSprop -> RMSProp?
TF/PyTorch/Keras use RMSprop, while Optax uses RMSProp. Let's follow Optax, then.
I've seen quite a few papers call it RMSProp.
The change has been committed.