Hey,
I'm trying to use your LRP implementation on pretrained ResNet-34 and ResNet-32 models (ImageNet and CIFAR-10/100, respectively).
I needed to set the ReLU layers to inplace=False to make it work, roughly as in the snippet below. After that it runs, but the output is weird and inconsistent.
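For reference, this is how I switch off the in-place ReLUs (a minimal sketch assuming torchvision's resnet34; my CIFAR ResNet-32 comes from a different source, but I apply the same loop there):

```python
import torch.nn as nn
from torchvision.models import resnet34

model = resnet34(pretrained=True)  # newer torchvision: resnet34(weights="IMAGENET1K_V1")
model.eval()

# Switch every ReLU to out-of-place so the LRP hooks see the
# unmodified input activations of each layer.
for module in model.modules():
    if isinstance(module, nn.ReLU):
        module.inplace = False
```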
These are the example input images:
and these are the explanations (heatmaps),
which don't make sense, since LRP explanations tend to be much more readable and coherent.
Thanks!
@razla, are you getting most of the LRP scores at zero or close to zero, or are you visualizing only the positive scores? A quick check like the sketch below would tell you.
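This is just a rough sketch; `relevance` stands in for whatever tensor your LRP call returns. It shows whether the map is dominated by near-zero values and whether the negative relevance is being clipped away in the plot:

```python
import torch

def inspect_relevance(relevance: torch.Tensor, eps: float = 1e-8) -> None:
    """Print basic statistics of an LRP relevance map."""
    r = relevance.detach().flatten().float()
    print(f"min={r.min():.3e}  max={r.max():.3e}  mean={r.mean():.3e}")
    print(f"|R| < {eps:g}: {(r.abs() < eps).float().mean():.2%}")
    print(f"R > 0: {(r > 0).float().mean():.2%}   R < 0: {(r < 0).float().mean():.2%}")

# When plotting, keep both signs visible with a diverging colormap, e.g.:
#   import matplotlib.pyplot as plt
#   r = relevance.sum(0)               # (C, H, W) -> (H, W)
#   v = r.abs().max()
#   plt.imshow(r.cpu(), cmap="seismic", vmin=-v, vmax=v)
```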
It may also be that we need to look into the layer types and the rules applied to them. You might need to readjust the rules that are used, or define custom ones; a rough sketch of what that involves follows below.
There is a note from @johannesk that we might need to tweak the rules a bit for ResNet-18.
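To illustrate what "defining a custom rule" amounts to, here is a generic sketch of the LRP-ε rule for a single linear layer using the usual gradient trick. This is plain PyTorch, not this repository's API, and `lrp_epsilon_linear` is a hypothetical helper name:

```python
import torch
import torch.nn as nn

def lrp_epsilon_linear(layer: nn.Linear, a: torch.Tensor,
                       relevance: torch.Tensor, eps: float = 1e-6) -> torch.Tensor:
    """LRP-epsilon: redistribute the layer's output relevance back to its inputs `a`."""
    a = a.clone().detach().requires_grad_(True)
    z = layer(a)
    z = z + eps * ((z >= 0).float() * 2 - 1)   # sign-aware stabilizer
    s = (relevance / z).detach()               # s_j = R_j / z_j
    (z * s).sum().backward()                   # a.grad = W^T s
    return (a * a.grad).detach()               # R_i = a_i * (W^T s)_i
```

The same pattern carries over to Conv2d layers (autograd handles the transposed convolution), and choosing a different rule per layer type then comes down to an isinstance check when the rules are attached.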