Compare the gradient consistency between GPU and CPU calculations #3476
Conversation
for name in numeric_grads:
    b = numpy.array(scope.find_var(grad_var_name(name)).get_tensor())
    a = numeric_grads[name]

def get_grad(self, forward_op, backward_op, input_vars, grad_names, place):
Rename to __get_gradient? A private method should be named with a double-underscore prefix (__xxx):
https://google.github.io/styleguide/pyguide.html?showone=Naming#Naming
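For readers unfamiliar with the convention: a leading double underscore triggers Python's name mangling, which is what makes __xxx methods effectively private. A minimal illustration (the class and method names here are only for demonstration, not the PR's actual code):

```python
class GradientChecker:
    def __get_gradient(self):          # double underscore: mangled to
        return "private"               # _GradientChecker__get_gradient

    def check(self):                   # a public method can still call the
        return self.__get_gradient()   # private helper from inside the class


checker = GradientChecker()
print(checker.check())                       # -> private
# checker.__get_gradient()                   # AttributeError outside the class
print(checker._GradientChecker__get_gradient())  # mangled name is reachable
```

Name mangling is a naming convention rather than true access control, but it keeps the helper out of the class's public surface.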
Done. Now using __get_gradient.
out_names = [item for k in outputs for item in outputs[k]]

# create input var and set value
for name, value in input_vars.iteritems():
These input_vars are numpy arrays rather than Paddle Variables; would it be better to name the argument input_python_vars?
Renamed to input_value; input_python_vars is a little long.
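For context, the argument under discussion is a plain dict mapping variable names to numpy arrays, not a dict of framework Variables. A minimal sketch of such a dict (the names and shapes are illustrative, not taken from the PR); note that iteritems() is Python 2 only, while items() works in both Python 2 and 3:

```python
import numpy

# input_value maps a variable name to the numpy array whose contents
# will be copied into the corresponding framework tensor.
input_value = {
    "X": numpy.random.random((2, 3)).astype("float32"),
    "Y": numpy.random.random((2, 3)).astype("float32"),
}

# iterate name/array pairs; .items() replaces Python 2's .iteritems()
for name, value in input_value.items():
    print(name, value.shape)
```

Calling it input_value (rather than input_vars) signals that the values are raw numpy data, which is the point of the review comment.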
    ]
    return outs

def compare_grad(self, forward_op, inputs):
Same as above: rename to __compare_gradient?
This is not a private method. The user can call this method to compare the gradients for the backward op.
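The comparison itself is device-independent: compute a numeric gradient by finite differences, fetch the gradient the operator actually produced, and compare the two arrays. A self-contained sketch of that check (the helper function, test function, and tolerance are assumptions for illustration, not the PR's actual code):

```python
import numpy

def numeric_grad(f, x, delta=1e-5):
    """Central-difference numeric gradient of a scalar function f at x."""
    grad = numpy.zeros_like(x)
    flat_x, flat_g = x.reshape(-1), grad.reshape(-1)  # views into x and grad
    for i in range(flat_x.size):
        orig = flat_x[i]
        flat_x[i] = orig + delta
        plus = f(x)
        flat_x[i] = orig - delta
        minus = f(x)
        flat_x[i] = orig                       # restore the perturbed entry
        flat_g[i] = (plus - minus) / (2 * delta)
    return grad

x = numpy.random.random((2, 3)).astype("float64")
f = lambda v: (v ** 2).sum()   # analytic gradient of sum(v^2) is 2 * v
a = numeric_grad(f, x)         # plays the role of numeric_grads[name]
b = 2 * x                      # plays the role of the op's gradient tensor
assert numpy.allclose(a, b, atol=1e-4)
```

Whether b comes from a CPU or GPU run only changes where the tensor is computed; the allclose comparison at the end is the same.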
for name in numeric_grads:
    b = numpy.array(scope.find_var(grad_var_name(name)).get_tensor())
    a = numeric_grads[name]

def get_grad(self, forward_op, backward_op, input_vars, grad_names, place):
Add some comments to these methods.
Done.
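A sketch of the kind of docstring such a comment could ask for, using the signature from the diff above; the parameter descriptions are assumptions for illustration, not the PR's actual documentation:

```python
def get_grad(self, forward_op, backward_op, input_vars, grad_names, place):
    """Run forward_op then backward_op and fetch the computed gradients.

    :param forward_op: the operator whose gradient is being checked
    :param backward_op: the gradient operator derived from forward_op
    :param input_vars: dict mapping input names to numpy arrays
    :param grad_names: names of the gradient variables to fetch
    :param place: the device to run on (e.g. a CPU or GPU place)
    :return: the fetched gradients as numpy arrays, one per grad_names entry
    """
```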
Fix #3478 in gradient_checker.py. And refine gradient_checker.py: move the unit tests in gradient_checker.py into test_gradient_checker.py.