Fix require grad warning for non-leaf tensor in noise tunnel #426
Conversation
```python
grad_enabled = True if gradient_neuron_index is not None or grad_enabled else False
with torch.autograd.set_grad_enabled(grad_enabled):
```
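For context, here is a minimal self-contained sketch of the pattern under review in this hunk; the function name and signature are illustrative, not Captum's actual internals:

```python
import torch

def forward_eval_sketch(forward_fn, inputs, gradient_neuron_index=None, grad_enabled=False):
    # Enable autograd only when a neuron gradient is requested or the caller
    # explicitly asks for it; otherwise run without building a graph.
    grad_enabled = gradient_neuron_index is not None or grad_enabled
    with torch.autograd.set_grad_enabled(grad_enabled):
        return forward_fn(inputs)
```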
Another option is to leave set_grad_enabled(True) here.
This looks good! We don't need to enable grad if unnecessary.
Looks good, thanks for fixing this! :)
One question related to #421: for NoiseTunnel with Saliency (or other methods that compute gradients directly w.r.t. the provided input), won't the user now always get this warning, since the input to the method will not require grad?

    UserWarning: Input Tensor 0 did not already require gradients, required_grads has been set automatically.

To avoid this, we can internally require grad on inputs_with_noise for a gradient-based method. What do you think?
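A rough sketch of that suggestion, assuming a helper that adds the noise (the helper name and the `stdev` default are hypothetical, not Captum's API):

```python
import torch

def add_noise_and_require_grad(inp: torch.Tensor, stdev: float = 0.1) -> torch.Tensor:
    # When inp does not require grad, inp + noise is a leaf tensor, so
    # requires_grad_() can be set on it without triggering the non-leaf warning.
    inputs_with_noise = inp + torch.randn_like(inp) * stdev
    if not inputs_with_noise.requires_grad:
        inputs_with_noise.requires_grad_()
    return inputs_with_noise
```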
Yeah, that's a good point! Added require_grads for GradientAttribution methods.
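A hedged sketch of what such a fix could look like; it assumes NoiseTunnel can tell whether the wrapped method is gradient-based (in Captum that check would be along the lines of an isinstance test against the GradientAttribution base class), and the function name here is illustrative:

```python
import torch

def maybe_require_grad(inputs_with_noise, is_gradient_method: bool):
    # Only gradient-based methods (e.g. Saliency) need the noised inputs to
    # require grad; perturbation-based methods can skip this entirely.
    if is_gradient_method:
        for t in inputs_with_noise:
            if not t.requires_grad:
                t.requires_grad_()
    return inputs_with_noise
```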
@NarineK has imported this pull request. If you are a Facebook employee, you can view this diff on Phabricator.
Fix require grad warning for non-leaf tensor in noise tunnel (#426)

Summary:
This will fix the warning specifically related to NoiseTunnel in pytorch#421. In addition to that, I moved almost everything under no_grad in the attribute method, which will hopefully also help with runtime performance. In `_forward_layer_eval` I had to add a `grad_enabled` flag to allow enabling the gradients externally, as it is also needed in the `test_neuron_gradient.py` test case.

Pull Request resolved: pytorch#426
Reviewed By: vivekmig
Differential Revision: D22500566
Pulled By: NarineK
fbshipit-source-id: d3170e1711012593ff421b964a02e54532a95b13
This will fix the warning error specifically related to NoiseTunnel in #421.
In addition to that, I moved almost everything under no_grad in the attribute method. This will hopefully also help with runtime performance.
In `_forward_layer_eval` I had to add a `grad_enabled` flag in order to allow enabling the gradients externally, as it is also needed in the `test_neuron_gradient.py` test case.
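To illustrate how the no_grad restructuring can coexist with gradient computation, here is a minimal sketch (not Captum's actual code; all names are illustrative): an inner torch.autograd.set_grad_enabled(True) takes precedence over an enclosing torch.no_grad(), so the bookkeeping stays graph-free while the wrapped method still gets gradients.

```python
import torch

def saliency_sketch(forward_fn, inp: torch.Tensor) -> torch.Tensor:
    # Re-enable autograd locally, even if the caller is inside torch.no_grad().
    with torch.autograd.set_grad_enabled(True):
        if not inp.requires_grad:
            inp.requires_grad_()
        out = forward_fn(inp).sum()
        (grad,) = torch.autograd.grad(out, inp)
    return grad

def noise_tunnel_sketch(forward_fn, inp: torch.Tensor,
                        n_samples: int = 5, stdev: float = 0.1) -> torch.Tensor:
    with torch.no_grad():  # noise bookkeeping needs no autograd graph
        noised = [inp + torch.randn_like(inp) * stdev for _ in range(n_samples)]
    grads = [saliency_sketch(forward_fn, x) for x in noised]
    with torch.no_grad():  # averaging the per-sample gradients needs no graph either
        return torch.stack(grads).mean(dim=0)
```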