Multi Layer Support #456
Conversation
Looks good! Thank you for working on this PR. Minor comments related to torch.nn.Sequential and documentation.
@vivekmig has imported this pull request. If you are a Facebook employee, you can view this diff on Phabricator.
Summary: This adds support for obtaining attributions for multiple layers with LayerActivation and LayerGradientXActivation, with corresponding changes to helper methods to support multiple layers. Also adds appropriate DataParallel tests and updates documentation.

Pull Request resolved: pytorch#456
Reviewed By: NarineK
Differential Revision: D23741638
Pulled By: vivekmig
fbshipit-source-id: 2b6a4d01de8a1ddc838482637b2bcef0a0f62515
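For reference, a minimal sketch of how the multi-layer form described in the summary might be used. The model, input shapes, and chosen layers here are illustrative only; the sketch assumes that passing a list of layers returns a list of attribution tensors, one per layer, in the same order.

```python
import torch
import torch.nn as nn
from captum.attr import LayerActivation, LayerGradientXActivation

# Illustrative model; any nn.Module with addressable submodules would work.
model = nn.Sequential(
    nn.Linear(10, 20),
    nn.ReLU(),
    nn.Linear(20, 5),
)
inputs = torch.randn(4, 10)

# Pass a list of layers instead of a single layer (the feature this PR adds).
layer_act = LayerActivation(model, [model[0], model[2]])
activations = layer_act.attribute(inputs)
for act in activations:
    # One attribution tensor per requested layer, in order.
    print(act.shape)

# The same list-of-layers form for LayerGradientXActivation; `target=0`
# selects an output index for the gradient computation.
layer_ga = LayerGradientXActivation(model, [model[0], model[2]])
grad_x_acts = layer_ga.attribute(inputs, target=0)
```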