LRP with pretrained ResNet returns weird heatmaps #1035


Open
razla opened this issue Sep 24, 2022 · 1 comment

Comments


razla commented Sep 24, 2022

Hey,

I’m trying to use your LRP implementation on pretrained ResNet-34/32 models (trained on ImageNet and CIFAR-10/100, respectively).

I needed to convert the ReLU layers to inplace=False in order to make it work. After that it runs, but the output is weird and inconsistent…
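For reference, this is roughly how I do the conversion. The helper name `disable_inplace_relu` is my own; it just walks the module tree and flips the `inplace` flag so LRP's hooks see unmodified activations. The small `nn.Sequential` stand-in below is only for illustration; a pretrained `torchvision.models.resnet34` is handled the same way:

```python
import torch.nn as nn

def disable_inplace_relu(model: nn.Module) -> nn.Module:
    """Set inplace=False on every ReLU so attribution hooks
    don't see activations that were overwritten in place."""
    for module in model.modules():
        if isinstance(module, nn.ReLU):
            module.inplace = False
    return model

# Stand-in model (a pretrained ResNet from torchvision works the same way):
model = nn.Sequential(nn.Conv2d(3, 8, 3), nn.ReLU(inplace=True), nn.Flatten())
disable_inplace_relu(model)
print(all(not m.inplace for m in model.modules() if isinstance(m, nn.ReLU)))  # True
```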

These are the input images, for example:

image

and these are the explanations (heatmaps):

image

which doesn’t make sense, since LRP heatmaps tend to be much more readable and coherent.

Thanks!

@NarineK
Contributor

NarineK commented Sep 30, 2022

@razla, are you getting most of the LRP scores as zero or close to zero, or are you visualizing only the positive scores?
It could be that we need to look into the layer types and the rules applied to them. You might need to adjust the rules used or define custom ones.

There is a note from @johannesk that we might need to tweak the rules a bit for ResNet-18.

#668 (comment)
