
About Local Attention Module (LAM) and the code #11

Open
Urian-oy opened this issue Oct 15, 2024 · 1 comment

Comments

@Urian-oy

Hi Bro, it's me again.
I read your paper again and plan to build on your idea, but I'm a little confused.

I find that the LAM doesn't match the code. The diagram of the LAM module in the paper shows that the low-frequency portion of the image feature is not multiplied by a weight, but in your code the low-frequency part is multiplied by a weight parameter. I've attached your code and the LAM figure.

[attached screenshots: the LAM code and the LAM diagram from the paper]

@c-yn
Owner

c-yn commented Oct 15, 2024

Hi, the multiplication by the parameters is implemented in the line preceding the arrow in the diagram.

lamb_l and lamb_h are parameters used to fuse the results from the two branches.
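For clarity, here is a minimal sketch of the fusion the reply describes: each frequency branch is scaled by its own learnable parameter before the branches are summed. This is an illustration only, not the repository's actual code; `fuse_branches`, the scalar defaults, and the toy feature maps are all hypothetical stand-ins (in the real model, lamb_l/lamb_h would be learnable parameters rather than plain floats).

```python
import numpy as np

def fuse_branches(feat_low, feat_high, lamb_l=0.5, lamb_h=0.5):
    """Hypothetical sketch: weight each frequency branch, then sum.

    lamb_l / lamb_h play the role of the learnable fusion parameters
    mentioned in the reply; here they are plain floats for illustration.
    """
    return lamb_l * feat_low + lamb_h * feat_high

# Toy stand-ins for the low- and high-frequency feature maps.
low = np.ones((2, 2))
high = np.full((2, 2), 3.0)

fused = fuse_branches(low, high, lamb_l=0.25, lamb_h=0.75)
# Each element: 0.25 * 1.0 + 0.75 * 3.0 = 2.5
print(fused)
```

This matches the point of the reply: the multiplication happens on the line feeding into the sum, so it may not be drawn as a separate box in the diagram even though it is present in the code.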
