1 parent 86e6958 commit eefab69
captum/optim/_utils/circuits.py
@@ -112,7 +112,7 @@ def forward(self, x: torch.Tensor) -> torch.Tensor:
         max2avg_pool2d(child)


-def ignore_layer(model, layer: torch.nn.Module) -> None:
+def ignore_layer(model, layer) -> None:
     """
     Replace target layers with layers that do nothing.
     This is useful for removing the nonlinear ReLU
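For context, a function with this docstring plausibly walks the model and swaps matching layers for no-op modules. The sketch below is an assumption based on the docstring alone, not the actual captum implementation; the recursion over `named_children` and the use of `torch.nn.Identity` as the "layer that does nothing" are illustrative choices. Note how the annotation change in the diff makes sense here: the `layer` argument is a layer *class* (e.g. `torch.nn.ReLU`), not a `torch.nn.Module` instance.

```python
import torch


def ignore_layer(model: torch.nn.Module, layer: type) -> None:
    # Recursively replace every instance of `layer` with an Identity
    # module, which passes its input through unchanged.
    for name, child in model.named_children():
        if isinstance(child, layer):
            setattr(model, name, torch.nn.Identity())
        else:
            ignore_layer(child, layer)


# Example: strip the nonlinear ReLU from a small model.
model = torch.nn.Sequential(torch.nn.Linear(4, 4), torch.nn.ReLU())
ignore_layer(model, torch.nn.ReLU)
```

After the call, `model[1]` is a `torch.nn.Identity`, so the forward pass skips the ReLU nonlinearity without disturbing the rest of the module tree.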