Thanks for the great work!

I am a bit confused about this piece of code:

axial-deeplab/lib/models/axialnet.py, line 67 at fe1d052

According to Eq. 4 in the paper, I have the impression that it should be torch.einsum('bgcj,cij->bgij', k, k_embedding), since p is the varying index. Please correct me if I am wrong. Thanks!

Reply: This depends on which axis of the embedding you choose to vary. The two axes of the embedding here correspond to two different directions, but both are relative.
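For context, here is a minimal, self-contained sketch of the reply's point. The shapes (b = batch, g = groups, c = channels per group, L = axial length) and the construction of k_embedding from a table of relative offsets are illustrative assumptions, not code copied from axialnet.py. It shows that swapping the embedding's two spatial axes flips which relative direction it encodes, after which the two einsum variants agree up to a transpose of the output:

```python
import torch

# Illustrative shapes (assumptions, not the actual values in axialnet.py).
b, g, c, L = 2, 4, 8, 5
k = torch.randn(b, g, c, L)

# Hypothetical relative embedding: a table indexed by the offset (i - j),
# shifted to be non-negative, then expanded to shape (c, L, L) so that
# k_embedding[c, i, j] = rel[c, i - j + L - 1].
rel = torch.randn(c, 2 * L - 1)
offsets = torch.arange(L).view(-1, 1) - torch.arange(L).view(1, -1) + L - 1
k_embedding = rel[:, offsets]  # (c, L, L)

# Variant quoted in the issue from the repo: k's position index aligns
# with the first spatial axis of the embedding.
out_repo = torch.einsum('bgci,cij->bgij', k, k_embedding)

# Variant proposed in the issue: k's position index aligns with the
# second spatial axis instead.
out_issue = torch.einsum('bgcj,cij->bgij', k, k_embedding)

# Transposing the embedding's two spatial axes turns the offset (i - j)
# into (j - i), i.e. the other relative direction. Each variant then
# equals the other up to a transpose of the output.
flipped = k_embedding.transpose(1, 2)
assert torch.allclose(
    out_issue,
    torch.einsum('bgci,cij->bgij', k, flipped).transpose(2, 3))
assert torch.allclose(
    out_repo,
    torch.einsum('bgcj,cij->bgij', k, flipped).transpose(2, 3))
```

So neither contraction is wrong in itself; they differ only in which of the two relative directions the embedding's axes are taken to represent, which is what the reply above says.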