tf.keras.optimizers.experimental.AdamW only supports constant weight_decay #55824
Comments
@x10000year, thanks for proposing a new feature request in TF.
As the code above shows, tfa.optimizers.AdamW lets us specify a schedule for the weight decay, which should be proportional to the learning-rate schedule. However, I cannot do the same with tf.keras.optimizers.experimental.AdamW.
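For context, a minimal sketch of the TFA pattern being referenced, following the example in the tfa.optimizers.AdamW docstring (the schedule boundaries and decay values here are illustrative):

```python
import tensorflow as tf
import tensorflow_addons as tfa

# A step counter that the training loop is expected to advance.
step = tf.Variable(0, trainable=False)
schedule = tf.keras.optimizers.schedules.PiecewiseConstantDecay(
    [10000, 15000], [1e-0, 1e-1, 1e-2])

# tfa.optimizers.AdamW accepts a callable for weight_decay, so the decay
# can track the same schedule as the learning rate.
lr = 1e-1 * schedule(step)
wd = lambda: 1e-4 * schedule(step)
optimizer = tfa.optimizers.AdamW(learning_rate=lr, weight_decay=wd)
```

Passing a callable keeps the decay proportional to the learning rate as training progresses, which is exactly what a plain float argument cannot do.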
Thanks for opening this issue. Development of Keras has moved to a separate repository: https://github.com/keras-team/keras/issues. Please post this issue on the keras-team/keras repo.
This issue has been automatically marked as stale because it has no recent activity. It will be closed if no further activity occurs. Thank you.
Closing as stale. Please reopen if you'd like to work on this further.
Issue Type: Feature Request
Source: source
TensorFlow version: 2.8
Custom code: No
OS platform and distribution: No response
Mobile device: No response
Python version: No response
Bazel version: No response
GCC/compiler version: No response
CUDA/cuDNN version: No response
GPU model and memory: No response
Current behaviour?: tf.keras.optimizers.experimental.AdamW accepts only a constant weight_decay, so the decay cannot follow the learning-rate schedule.
Standalone code to reproduce the issue:
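A minimal sketch of the limitation being reported, assuming the TF 2.8 signature of tf.keras.optimizers.experimental.AdamW (weight_decay takes a plain float, not a callable or schedule):

```python
import tensorflow as tf

# The learning rate can follow a schedule...
lr_schedule = tf.keras.optimizers.schedules.PiecewiseConstantDecay(
    [10000, 15000], [1e-1, 1e-2, 1e-3])

# ...but weight_decay must be a constant float. There is no supported way
# to pass a schedule or callable here to keep the decay proportional to
# the learning rate, which is the feature being requested.
optimizer = tf.keras.optimizers.experimental.AdamW(
    learning_rate=lr_schedule,
    weight_decay=1e-4,
)
```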
Relevant log output: No response