
How to implement spatial attention module? #1

Closed
imyangs opened this issue Dec 2, 2021 · 8 comments

Comments

@imyangs

imyangs commented Dec 2, 2021

Dear author

Do you have any plans to release the code for spatial attention? If not, could you share some ideas on how to implement it?
Many thanks.

@Christian-lyc
Owner

Hi, thank you so much for your interest.
The spatial part is the same as the channel part: you basically need to reshape H and W into one dimension. BN then measures the importance of each pixel.

@imyangs
Author

imyangs commented Dec 3, 2021

Hi, many thanks for your quick reply.

Since I cannot map the default dimension of the BN scale factor to the spatial dims (C != H × W), my real concern is how to implement the "pixel normalization" module. My idea is to set self.channels = H×W, and then implement the forward() function as below:

Please help me figure out the right solution. Many thanks.

[screenshot of the proposed forward() implementation]
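The idea above (treating each of the H×W pixel positions as a BN "feature" and reusing the channel-attention recipe) can be sketched roughly as follows. This is a minimal, hypothetical reconstruction, not the code from the screenshot: the class name `SpatialAtt` and the assumption of a fixed input height and width are mine.

```python
import torch
import torch.nn as nn

class SpatialAtt(nn.Module):
    """Hypothetical sketch: NAM-style attention over the flattened H*W
    dimension. Each pixel position gets its own BN scale factor (gamma),
    whose magnitude is read as that pixel's importance."""

    def __init__(self, height, width):
        super().__init__()
        self.hw = height * width           # requires a fixed input size
        self.bn = nn.BatchNorm1d(self.hw)  # one gamma per pixel position

    def forward(self, x):
        n, c, h, w = x.shape
        residual = x
        # (N, C, H, W) -> (N, H*W, C): pixels become the "channel" axis
        x = x.view(n, c, h * w).transpose(1, 2)
        x = self.bn(x)
        # Normalized gamma magnitudes act as per-pixel importance weights
        gamma = self.bn.weight.abs()
        x = x * (gamma / gamma.sum()).view(1, self.hw, 1)
        # Back to (N, C, H, W); reshape handles the non-contiguous layout
        x = x.transpose(1, 2).reshape(n, c, h, w)
        return torch.sigmoid(x) * residual
```

This mirrors the channel-attention pattern (BN, weight by normalized |gamma|, sigmoid gate on the residual), only applied along the pixel axis instead of the channel axis.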

@Christian-lyc
Owner

Hi, it seems this is the same as what the paper describes.

@imyangs
Author

imyangs commented Dec 3, 2021

Thank you so much. I will test it later.

@imyangs imyangs closed this as completed Dec 3, 2021
@Ronky123

Ronky123 commented Dec 7, 2021

Thank you so much. I will test it later.

Hi, did you test the spatial attention? Does it work?

@imyangs
Author

imyangs commented Dec 12, 2021

Thank you so much. I will test it later.

Hi, did you test the spatial attention? Does it work?

Sorry, I have not. Maybe you can just paste my code above and run the training script to get a result.

@haikunzhang95

Thank you so much. I will test it later.

Hi, did you test your spatial attention code? Does it work well?
Thank you!

@23jisuper

Thank you so much. I will test it later.

Hi, did you test the spatial attention? Does it work?

Sorry, I have not. Maybe you can just paste my code above and run the training script to get a result.

Hello, have you implemented the spatial attention? My idea is: could we first use pooling to normalize over H*W, and then compute the BN?
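The pooling idea mentioned above could be sketched roughly as follows: average-pool across channels so each pixel reduces to a single value, then score pixel positions with BN scale factors. This is only a hypothetical interpretation of the comment; the class name `PooledSpatialAtt` and the specific pooling choice (channel mean) are my assumptions, not code from this repository.

```python
import torch
import torch.nn as nn

class PooledSpatialAtt(nn.Module):
    """Hypothetical sketch: pool channels to a single (H, W) map first,
    then apply BN over the flattened H*W positions and gate the input
    with the resulting per-pixel attention."""

    def __init__(self, height, width):
        super().__init__()
        self.hw = height * width
        self.bn = nn.BatchNorm1d(self.hw)  # one gamma per pixel position

    def forward(self, x):
        n, c, h, w = x.shape
        residual = x
        # Average over channels: (N, C, H, W) -> (N, H*W)
        s = x.mean(dim=1).view(n, self.hw)
        s = self.bn(s)
        # Weight each pixel by its normalized |gamma| importance
        gamma = self.bn.weight.abs()
        s = s * (gamma / gamma.sum())
        # Broadcast the (N, 1, H, W) attention map over all channels
        att = torch.sigmoid(s).view(n, 1, h, w)
        return att * residual
```

Compared with flattening H×W into the BN feature axis directly, pooling first makes the BN much smaller (H·W features over scalars rather than over C-dim vectors), at the cost of discarding per-channel spatial information before the importance is measured.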
