How to implement spatial attention module? #1
Hi, thank you so much for your interest.
Hi, many thanks for your quick reply. I cannot map the default dimension of the BN scale factor to the spatial dims (C != H × W?), so my real concern is how to implement the "pixel normalization" module. My idea is to set self.channels = H×W, and then implement the forward() function as below. Please help me figure out the right solution. Many thanks.
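Something like this, as a rough sketch (assuming PyTorch and a fixed input resolution; the class name PixelNormAttention and its arguments are placeholders, not verified code from the repo):

```python
import torch
import torch.nn as nn

class PixelNormAttention(nn.Module):
    """Rough sketch of spatial ("pixel") attention via BN scale factors.

    Each of the H*W spatial positions is treated as a BatchNorm1d
    channel, so the learned BN weight (gamma) per position can serve as
    an attention weight, mirroring the channel-attention variant.
    Assumes a fixed input resolution (height, width).
    """
    def __init__(self, height, width):
        super().__init__()
        self.channels = height * width  # spatial positions act as BN channels
        self.bn = nn.BatchNorm1d(self.channels)

    def forward(self, x):
        b, c, h, w = x.shape
        residual = x
        # (B, C, H, W) -> (B, H*W, C): positions become BN channels
        x = x.flatten(2).transpose(1, 2)
        x = self.bn(x)
        # Normalize the BN scale factors into per-position weights
        weight = self.bn.weight.abs() / self.bn.weight.abs().sum()
        x = x * weight.view(1, -1, 1)
        # Back to (B, C, H, W)
        x = x.transpose(1, 2).reshape(b, c, h, w)
        return torch.sigmoid(x) * residual
```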
Hi, it seems this is the same as what the paper describes.
Thank you so much. I will test it later.
Hi, did you test the spatial attention? Does it work?
Sorry, I have not. Maybe you can just paste my code above and run the training script to get a result.
Hi, did you test your spatial attention code? Does it work well?
Hello, regarding the spatial attention, have you implemented it? My idea is: could we first use pooling to normalize H*W, and then compute the BN?
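One possible reading of this pooling idea, as a sketch only (not the author's confirmed design; PooledSpatialBN, pooled_size, and the bilinear upsampling are all assumptions): average-pool the feature map to a fixed spatial size so the BN over positions works at any input resolution, then upsample the attention map back.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class PooledSpatialBN(nn.Module):
    """Sketch: pool H*W to a fixed size, run BN over the pooled
    positions, then upsample the attention map back to (H, W)."""
    def __init__(self, pooled_size=7):
        super().__init__()
        self.size = pooled_size
        self.pool = nn.AdaptiveAvgPool2d(pooled_size)
        self.bn = nn.BatchNorm1d(pooled_size * pooled_size)

    def forward(self, x):
        b, c, h, w = x.shape
        p = self.pool(x)                      # (B, C, s, s)
        p = p.flatten(2).transpose(1, 2)      # (B, s*s, C)
        p = self.bn(p)
        # Weight each pooled position by its normalized BN scale factor
        weight = self.bn.weight.abs() / self.bn.weight.abs().sum()
        p = p * weight.view(1, -1, 1)
        p = p.transpose(1, 2).reshape(b, c, self.size, self.size)
        # Upsample the attention map back to the input resolution
        attn = torch.sigmoid(
            F.interpolate(p, size=(h, w), mode="bilinear", align_corners=False)
        )
        return x * attn
```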
Dear author,
Do you have any plans to release the code for spatial attention? If not, please give some ideas about how to implement it.
Many thanks.