This repository has been archived by the owner on Oct 31, 2023. It is now read-only.

Support of AdamW #140

Open
ridiculouz opened this issue May 6, 2023 · 0 comments

Comments

@ridiculouz

Hi there,
I notice that some optimizers in torch.optim are currently not automatically supported by higher, e.g. AdamW, NAdam, etc.
Are there any plans to add them as higher.optim.DifferentiableOptimizer subclasses?
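One possible stop-gap sketch, assuming higher.register_optim and higher.optim.DifferentiableAdam behave as in the library's optim module, is to map torch.optim.AdamW onto the existing differentiable Adam update. This is only an approximation: DifferentiableAdam treats weight decay as L2 regularization rather than AdamW's decoupled decay, so the two coincide only when weight_decay is 0.

```python
# Workaround sketch (not an official higher feature): register AdamW so that
# higher substitutes the differentiable Adam update for it. Assumes
# higher.register_optim and higher.optim.DifferentiableAdam are available;
# only equivalent to true AdamW when weight_decay=0.
import torch
import higher

higher.register_optim(torch.optim.AdamW, higher.optim.DifferentiableAdam)

model = torch.nn.Linear(4, 1)
opt = torch.optim.AdamW(model.parameters(), lr=1e-3, weight_decay=0.0)

x, y = torch.randn(8, 4), torch.randn(8, 1)
with higher.innerloop_ctx(model, opt) as (fmodel, diffopt):
    loss = torch.nn.functional.mse_loss(fmodel(x), y)
    diffopt.step(loss)  # differentiable inner-loop update
```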
