
What is the intent of Layer SpecialSpmmFunctionFinal #27

Open
hmthanh opened this issue Jul 24, 2020 · 2 comments

hmthanh commented Jul 24, 2020

Hi,
I'm just wondering why your model has to go through the layer SpecialSpmmFunctionFinal, and what the intent of this layer is.
The layer's forward is:

import torch

class SpecialSpmmFunctionFinal(torch.autograd.Function):
    """Special function for sparse-region-only backpropagation."""
    @staticmethod
    def forward(ctx, edge, edge_w, N, E, out_features):
        # assert indices.requires_grad == False
        # Build an N x N sparse COO tensor with a trailing dense
        # dimension of size out_features, one entry per edge.
        a = torch.sparse_coo_tensor(
            edge, edge_w, torch.Size([N, N, out_features]))
        # Sum over the column (tail) dimension: per-row totals.
        b = torch.sparse.sum(a, dim=1)
        # Save shapes and row indices for the backward pass.
        ctx.N = b.shape[0]
        ctx.outfeat = b.shape[1]
        ctx.E = E
        ctx.indices = a._indices()[0, :]

        return b.to_dense()

I debugged with your default parameters on the WN18 dataset. In the first epoch (first batch of the dataset), I get these shapes:

  • Input:
    N : 40943 : the number of entities in the dataset
    E : 294211 : the number of edges after concatenating the direct (head, tail) pairs with the 2-hop (head, tail) pairs
    edge : (2, 294211) : holds the <head_id, tail_id> index pairs (including the <2_hop_head_id, 2_hop_tail_id> ones)
    edge_w : (294211, 1) : holds the weights trained in the GAT layer

  • Output:
    e_rowsum : (40943, 1) : represents .... ??? .....
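
To make these shapes concrete, here is a minimal sketch of the same computation with toy sizes (the values below are hypothetical stand-ins for N = 40943, E = 294211, out_features = 1):

import torch

# Toy stand-ins for N = 40943, E = 294211, out_features = 1.
N, E, out_features = 4, 5, 1
edge = torch.tensor([[0, 0, 1, 2, 3],    # head (row) indices
                     [1, 2, 3, 0, 1]])   # tail (col) indices
edge_w = torch.rand(E, out_features)     # one weight per edge

# The same steps as SpecialSpmmFunctionFinal.forward:
a = torch.sparse_coo_tensor(edge, edge_w, torch.Size([N, N, out_features]))
e_rowsum = torch.sparse.sum(a, dim=1).to_dense()

# Row i of e_rowsum holds the sum of edge_w over all edges whose head is i.
print(e_rowsum.shape)  # torch.Size([4, 1])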

As far as I know, it just sums all the features of the training entities into an embedding vector, but I don't know why your model has to go through this layer. Can you explain the intent of the layer SpecialSpmmFunctionFinal?
Thanks @deepakn97

@lipingcoding

Same puzzle for me.

@zhangxin9988

I think the forward of SpecialSpmmFunctionFinal is intended to compute the row sum of a sparse matrix, and its backward returns the gradient for the sparse matrix's values. But I find that torch.sparse may already handle the backward of the row-sum operation by itself, for example:
import torch

i = torch.LongTensor([[0, 1, 1], [2, 0, 2]])  # row, col indices
v = torch.FloatTensor([3, 4, 5])              # values
v.requires_grad = True
m = torch.sparse_coo_tensor(i, v, torch.Size([2, 3]))
m.retain_grad()

m1 = torch.sparse.sum(m, dim=1)  # sparse row sum
m1.retain_grad()

m2 = torch.sparse.sum(m1)  # reduce to a scalar
m2.backward()
print(v.grad)  # the gradient of v is tensor([1., 1., 1.])
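
Following up on that snippet, the gradient also appears to flow through .to_dense(), which is exactly what SpecialSpmmFunctionFinal.forward returns. A minimal sketch, assuming a recent PyTorch build with sparse autograd support:

import torch

i = torch.LongTensor([[0, 1, 1], [2, 0, 2]])  # row, col indices
v = torch.FloatTensor([3, 4, 5])
v.requires_grad = True
m = torch.sparse_coo_tensor(i, v, torch.Size([2, 3]))

# Row sum followed by densification, mirroring the layer's forward.
rowsum = torch.sparse.sum(m, dim=1).to_dense()  # dense, shape (2,)
rowsum.sum().backward()
print(v.grad)  # tensor([1., 1., 1.]): one gradient entry per stored value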
