
Addition of Mean Subtraction Operator from "Revisiting 'Over-smoothing' in Deep GCNs" #5056

Closed
Alvaro-Ciudad opened this issue Jul 26, 2022 · 6 comments · Fixed by #5068

@Alvaro-Ciudad (Contributor) commented Jul 26, 2022

🚀 The feature, motivation and pitch

I have been working on graph autoencoders and ran into over-smoothing problems. I've tried a few alternatives, and this is one of the best-performing ones.
I also believe that PyG could benefit from more alternative methods for tackling over-smoothing. The code for the layer is already done, so it would just be a simple pull request with a few tests.

import torch
from torch_scatter import scatter


class MeanSubtraction(torch.nn.Module):
    r"""Centers node features by subtracting the feature mean
    (per graph when a `batch` vector is given)."""
    def forward(self, x, batch=None):
        if batch is None:
            # Single graph: subtract the global feature mean.
            return x - x.mean(dim=0, keepdim=True)
        # Mini-batch: subtract each graph's own feature mean.
        mean = scatter(x, batch, dim=0, reduce='mean')
        return x - mean.index_select(0, batch)
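
A minimal usage sketch (shapes and values are illustrative only):

import torch

x = torch.randn(6, 16)                     # 6 nodes, 16 features
batch = torch.tensor([0, 0, 0, 1, 1, 1])   # two graphs in one mini-batch

norm = MeanSubtraction()
out = norm(x, batch)
# Each graph's features are now centered (per-graph mean ~ 0):
print(out[:3].mean(dim=0).abs().max(), out[3:].mean(dim=0).abs().max())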

Alternatives

It could also be added as a flag inside the PairNorm implementation, since mean subtraction is a special case of that normalization (see the sketch below).
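
To illustrate the relationship, here is a minimal sketch in plain PyTorch, assuming the standard PairNorm formulation; the `scale` flag is hypothetical (not an existing PairNorm argument) and only shows that dropping the rescaling step leaves exactly the mean subtraction above:

import torch

def pair_norm(x, s=1.0, scale=True, eps=1e-5):
    # Centering step -- on its own, this is the proposed mean subtraction.
    x = x - x.mean(dim=0, keepdim=True)
    if not scale:
        return x
    # PairNorm additionally rescales so the average squared feature norm
    # (and hence the total pairwise distance) stays roughly constant.
    return s * x / (eps + x.pow(2).sum(dim=-1).mean().sqrt())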

Additional context

The paper in question: https://arxiv.org/pdf/2003.13663v1.pdf

@rusty1s (Member) commented Jul 27, 2022

Thanks for sharing. This looks super easy to integrate within the nn.aggr package. Let me know if you want to work on this. @lightaime and @Padarn can help you further.

@Padarn self-assigned this Jul 27, 2022
@Padarn (Contributor) commented Jul 27, 2022

Sure, I'll pick it up!

@Alvaro-Ciudad (Contributor, Author) commented

Great, it works for me if you pick it up :)

@lightaime (Contributor) commented

Do we want to add it to nn.aggr? It looks more like something we should add to nn.norm to me. It is part of PairNorm, as @Alvaro-Ciudad mentioned.

@rusty1s (Member) commented Jul 27, 2022

Oh, you are right. I totally misread.

@Padarn (Contributor) commented Jul 28, 2022

I also misread this (sorry, I was on my phone)... I made a PR here: #5068, but I'm happy for @Alvaro-Ciudad to take over and make any changes that make sense to you.

I added it to the nn.norm package, but with support for using any aggregator from the new nn.aggr.
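
Roughly along these lines (a sketch only; the class and argument names here are hypothetical and the PR may look different, assuming nn.aggr.MeanAggregation is available):

import torch
from torch_geometric.nn.aggr import MeanAggregation

class MeanSubtractionNorm(torch.nn.Module):
    # Hypothetical sketch: center node features via a pluggable aggregation.
    def __init__(self, aggr=None):
        super().__init__()
        self.aggr = aggr if aggr is not None else MeanAggregation()

    def forward(self, x, batch=None):
        if batch is None:
            return x - x.mean(dim=0, keepdim=True)
        # Aggregate per graph, then broadcast back to the nodes.
        mean = self.aggr(x, batch)
        return x - mean[batch]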

@rusty1s linked a pull request Jul 28, 2022 that will close this issue