Implementation details of Sobel Layer? #4
Comments
Since the Sobel conv is the fundamental operation of your ESB, I'm waiting for your response. Thanks in advance.
Almost there, except for the order of the L2 norm and BatchNorm2d, which is corrected in the latest version (see https://arxiv.org/abs/2104.06832). Our implementation of the Sobel layer, together with the remaining modules, is still under internal review.
Sorry, but I still have a question. What is the L2 norm for? Your latest version shows the L2 norm appended after the BN layer, but the output of the Sobel layer and BN is already (B, 1, H, W).
In fact, we apply the L2 norm as the fusion function of the x and y responses, which is why the channel dimension becomes 1:
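The snippet originally attached to this comment was not captured here; the following is only a minimal sketch of the fusion being described, not the authors' code. It assumes fixed depthwise Sobel-x/Sobel-y kernels, a BatchNorm2d over the stacked gradient channels, and an L2 norm across the channel dimension as the fusion step, which collapses the output to (B, 1, H, W):

```python
import torch
import torch.nn as nn


class SobelEdge(nn.Module):
    """Sketch of a Sobel layer: depthwise Sobel conv -> BatchNorm2d -> L2-norm fusion."""

    def __init__(self, in_channels: int):
        super().__init__()
        kx = torch.tensor([[-1., 0., 1.],
                           [-2., 0., 2.],
                           [-1., 0., 1.]])
        ky = kx.t()
        # One Sobel-x and one Sobel-y kernel per input channel (depthwise conv).
        weight = torch.stack([kx, ky]).unsqueeze(1).repeat(in_channels, 1, 1, 1)
        self.conv = nn.Conv2d(in_channels, 2 * in_channels, kernel_size=3,
                              padding=1, groups=in_channels, bias=False)
        self.conv.weight = nn.Parameter(weight, requires_grad=False)
        self.bn = nn.BatchNorm2d(2 * in_channels)

    def forward(self, x):
        g = self.bn(self.conv(x))                       # (B, 2C, H, W): x/y gradients
        # L2 norm over the channel dimension fuses the x and y responses
        # into a single-channel edge map.
        return torch.norm(g, p=2, dim=1, keepdim=True)  # (B, 1, H, W)


feat = torch.randn(2, 64, 32, 32)
print(SobelEdge(64)(feat).shape)  # torch.Size([2, 1, 32, 32])
```

Taking the norm over the channel dimension is what makes the edge map single-channel, which matches the (B, 1, H, W) shape mentioned above.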
Hello there! I trained the model these days; could you please roughly tell me the final loss on CASIA_v2? I want to know whether it has converged.
Thanks for your work.
As the paper shows, it consists of four sublayers, but I wonder how the Sobel results in x and y are fused together.
Is the code below right?