Thanks for your interest in our work.
Your understanding of the code is correct: in TPVFormer04, cross-view hybrid attention is enabled only on the HW plane, so it degrades to self-attention.
Thanks for sharing the great work.
Regarding cross-view hybrid attention, is it applied only to the HW (top) plane?
TPVFormer/tpvformer04/modules/tpvformer_layer.py, line 172 (commit 2073589)
At this line the query attends to itself: key and value are both None. Later, in cross-view hybrid attention, the value is set to the concatenation of the plane queries:
TPVFormer/tpvformer04/modules/cross_view_hybrid_attention.py, line 163 (commit 2073589)
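
To make the query/key/value wiring concrete, here is a minimal sketch using standard (non-deformable) multi-head attention. It is not the actual TPVFormer module, which uses deformable attention; the names (`tpv_hw`, `tpv_zh`, `tpv_wz`), plane resolutions, and `embed_dims` below are assumptions for illustration only.

```python
# Minimal sketch of the two behaviours described above (assumed shapes/names,
# standard attention instead of the deformable attention used in TPVFormer).
import torch
import torch.nn as nn

embed_dims = 256
attn = nn.MultiheadAttention(embed_dims, num_heads=8, batch_first=True)

bs = 1
tpv_hw = torch.randn(bs, 100 * 100, embed_dims)  # HW (top) plane queries
tpv_zh = torch.randn(bs, 8 * 100, embed_dims)    # ZH (side) plane queries
tpv_wz = torch.randn(bs, 100 * 8, embed_dims)    # WZ (front) plane queries

# tpvformer04 behaviour as described in the thread: key/value fall back to the
# query itself, i.e. plain self-attention on the HW plane.
self_out, _ = attn(query=tpv_hw, key=tpv_hw, value=tpv_hw)

# Cross-view hybrid attention as described: the value (and key) become the
# concatenation of all three plane queries, so HW queries can also gather
# features from the ZH and WZ planes.
hybrid_kv = torch.cat([tpv_hw, tpv_zh, tpv_wz], dim=1)
hybrid_out, _ = attn(query=tpv_hw, key=hybrid_kv, value=hybrid_kv)
```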