the init parameter "optimizer" of Trainer() should be a function #11157
Comments
jacquesqiao changed the title from "the init parameter optimizer of Trainer() should be a function" to "the init parameter "optimizer" of Trainer() should be a function" on Jun 4, 2018
When we initialize the Optimizer with the Fluid code, the …
@jacquesqiao @seiriosPlus Does the bug only happen with the learning rate decay? I wonder if there is a way to create a copy of the optimizer with the proper …
@jacquesqiao: I think your suggestion sounds pretty good. I will make a PR implementing your approach.
Background
@seiriosPlus found a bug in the new Trainer high-level API when writing a model with learning rate decay: the optimizer is created in a different program than the one the trainer uses.

Reason and solution
Paddle/python/paddle/fluid/trainer.py
Lines 88 to 103 in 9f0dcfd
In the design of the Trainer API, train_program is a function; it is called inside the Trainer under a with scope:
Paddle/python/paddle/fluid/trainer.py
Lines 115 to 117 in 9f0dcfd
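The design above can be sketched in plain Python. This is a hedged, framework-free illustration of why calling train_program inside a with scope matters; program_guard and all names below are simplified stand-ins for the Fluid machinery, not the real API.

```python
import contextlib

active_program = None  # stand-in for the framework's "current program"

@contextlib.contextmanager
def program_guard(name):
    # Simplified stand-in for fluid.program_guard: makes `name` the
    # active program for the duration of the `with` block.
    global active_program
    prev, active_program = active_program, name
    try:
        yield
    finally:
        active_program = prev

def trainer_init(train_func):
    # As in trainer.py: the user's function is *called* under the guard,
    # so every op/variable it creates lands in the trainer's own program.
    with program_guard("trainer_main_program"):
        return train_func()

def train_program():
    # Whatever this creates is tagged with the currently active program.
    return "ops created in " + active_program

print(trainer_init(train_program))  # -> ops created in trainer_main_program
```

Because the function runs only after the guard is entered, its graph construction cannot leak into some other program that happened to be active earlier.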
But the init parameter optimizer is an object, so it is created outside the with scope. This is not right; we should make optimizer also a function that returns an optimizer object. The interface should be:
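The issue text breaks off before showing the proposed interface. Below is a hedged, framework-free sketch of the suggested fix: the Trainer accepts an optimizer-building function and calls it inside its own program scope. The Trainer signature and optimizer_func name are hypothetical illustrations, not the exact Fluid API.

```python
current_program = None  # stand-in for the framework's active program

class Trainer:
    # Hypothetical signature: `optimizer_func` is a callable that builds
    # the optimizer, mirroring how `train_func` already works.
    def __init__(self, train_func, optimizer_func):
        global current_program
        prev, current_program = current_program, "trainer_program"
        try:
            train_func()
            # The optimizer is constructed *inside* the trainer's scope,
            # so learning-rate-decay variables are created in the same
            # program the trainer actually runs.
            self.optimizer = optimizer_func()
        finally:
            current_program = prev

def train_program():
    pass  # would build the model here

def optimizer_func():
    # In real Fluid code this would return e.g. an SGD optimizer whose
    # learning rate is an exponential-decay schedule.
    return "optimizer built in " + current_program

trainer = Trainer(train_func=train_program, optimizer_func=optimizer_func)
print(trainer.optimizer)  # -> optimizer built in trainer_program
```

The key design point is laziness: passing a factory instead of an instance defers construction until the correct program is active, which is exactly what already makes the train_program-as-function design work.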