Some problems about Code #19
Same question. The forward function of LayerScaleBlockClassAttn as implemented in timm is:
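The timm code itself is not reproduced in the thread. For context, here is a minimal sketch of what a CaiT-style class-attention block looks like; the class name, the layer-scale init value, and the use of nn.MultiheadAttention in place of timm's ClassAttn are all illustrative, and drop_path is omitted for brevity:

```python
import torch
import torch.nn as nn

class ClassAttnBlockSketch(nn.Module):
    """Minimal sketch of a CaiT-style class-attention block (illustrative,
    not timm's actual code). Only the class token is updated; patch tokens
    pass through unchanged."""
    def __init__(self, dim, num_heads, mlp_ratio=4.0, init_values=1e-4):
        super().__init__()
        self.norm1 = nn.LayerNorm(dim)
        # Stand-in for timm's ClassAttn module.
        self.attn = nn.MultiheadAttention(dim, num_heads, batch_first=True)
        self.norm2 = nn.LayerNorm(dim)
        hidden = int(dim * mlp_ratio)
        self.mlp = nn.Sequential(nn.Linear(dim, hidden), nn.GELU(), nn.Linear(hidden, dim))
        self.gamma_1 = nn.Parameter(init_values * torch.ones(dim))  # layer scale
        self.gamma_2 = nn.Parameter(init_values * torch.ones(dim))

    def forward(self, x, x_cls):
        # Class attention: the class token is the only query; keys/values
        # cover the class token plus all patch tokens.
        u = self.norm1(torch.cat((x_cls, x), dim=1))
        attn_out, _ = self.attn(u[:, 0:1], u, u)
        x_cls = x_cls + self.gamma_1 * attn_out
        # The MLP residual is applied to the class token only.
        x_cls = x_cls + self.gamma_2 * self.mlp(self.norm2(x_cls))
        return x_cls

blk = ClassAttnBlockSketch(dim=64, num_heads=4)
x = torch.randn(2, 196, 64)      # patch tokens
x_cls = torch.randn(2, 1, 64)    # class token
print(blk(x, x_cls).shape)       # torch.Size([2, 1, 64])
```

The key property is that both the attention and MLP residuals write only into the class token, which is the behavior the CaiT paper describes.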
Hi, thanks for pointing this out. There are indeed some differences between our implementation and CaiT, but the experiments in the paper all use the method in the released code…
FAN/models/fan.py, lines 311 to 315 at commit ee1b7df
Hi! Thank you for your great work! According to CaiT, I think the code should be in the following form:
cls_token = x[:, 0:1] + self.drop_path(self.gamma2 * self.mlp(x[:, 0:1]))
x = torch.cat([cls_token, x[:, 1:]], dim=1)
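To make the suggested fix concrete, here is a small self-contained sketch (hypothetical shapes and a stand-in MLP, not the FAN code; drop_path omitted) contrasting the CaiT-style update, where the MLP sees only the class token, with an update that applies the MLP to every token:

```python
import torch
import torch.nn as nn

torch.manual_seed(0)
B, N, D = 2, 197, 64            # batch, tokens (1 cls + 196 patches), dim
x = torch.randn(B, N, D)
mlp = nn.Sequential(nn.Linear(D, 4 * D), nn.GELU(), nn.Linear(4 * D, D))
gamma2 = 1e-4 * torch.ones(D)   # layer-scale parameter

# CaiT-style: the MLP residual updates the class token only;
# patch tokens are carried through unchanged.
cls_token = x[:, 0:1] + gamma2 * mlp(x[:, 0:1])
out_cait = torch.cat([cls_token, x[:, 1:]], dim=1)

# Alternative: the MLP residual is applied to every token.
out_all = x + gamma2 * mlp(x)

# The two variants agree on the class token but differ on patch tokens.
print(torch.allclose(out_cait[:, 0:1], out_all[:, 0:1]))  # True
print(torch.allclose(out_cait[:, 1:], out_all[:, 1:]))    # False
```

Since the Linear layers act per token, the class-token outputs coincide; the variants diverge only in whether the patch tokens receive the MLP residual.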