forked from facebookresearch/multimodal
Don't reuse nn.ReLU modules in CLIP ResNet (facebookresearch#33)
Summary: Reusing the same ReLU module across multiple layers can make research harder; for example, PyTorch's hook system does not work properly on reused layer modules. I ran into this issue while building and testing interpretability tools on the CLIP models. This PR does not change how any of the models work; it merely makes each ReLU layer separately accessible for research. Let me know if I need to make any changes before it can be merged!

Pull Request resolved: facebookresearch#33
Reviewed By: ankitade
Differential Revision: D36110555
Pulled By: ebsmothers
fbshipit-source-id: 992ae5bb53dd1fe83e793f55cc7258cc06516a74
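To illustrate the hook problem the commit message describes, here is a minimal sketch (not the actual CLIP ResNet code; the toy `nn.Sequential` models are hypothetical) showing that a forward hook registered on a shared `nn.ReLU` fires once per reuse, so the two activation sites cannot be observed independently, whereas distinct ReLU modules can each be hooked on their own:

```python
import torch
import torch.nn as nn

# A model that reuses one ReLU module object at two positions.
shared_relu = nn.ReLU()
shared = nn.Sequential(nn.Linear(4, 4), shared_relu, nn.Linear(4, 4), shared_relu)

calls = []
shared_relu.register_forward_hook(lambda mod, inp, out: calls.append(out))

shared(torch.randn(1, 4))
# The single hook fires for BOTH uses of the module, so there is no way
# to attach a hook to only one of the two activation sites.
assert len(calls) == 2

# With separate ReLU modules, each activation site is independently hookable.
distinct = nn.Sequential(nn.Linear(4, 4), nn.ReLU(), nn.Linear(4, 4), nn.ReLU())
per_layer = []
distinct[1].register_forward_hook(lambda mod, inp, out: per_layer.append(out))

distinct(torch.randn(1, 4))
assert len(per_layer) == 1  # only the first ReLU's output was captured
```

Because `nn.ReLU` is stateless, swapping a shared instance for per-layer instances leaves the forward computation identical, which is why the commit does not change model outputs.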
Parent: 292219e
Commit: d216331

1 changed file with 12 additions and 11 deletions.