Hello, you mention in your paper that you project "object queries into Q and the depth embeddings into K and V by linear layers" in your cross-attention. However, in your open-source code, the object queries produced by nn.Embedding are fed directly into self.cross_attn_depth, and neither the object queries nor the depth embeddings appear to pass through separate linear layers first. Am I misunderstanding something? Thank you for your time.
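For reference, here is a rough sketch of what I mean. The names (cross_attn_depth, depth_embed) and shapes are my own assumptions based on reading the code, not the exact source:

```python
import torch
import torch.nn as nn

num_queries, d_model = 50, 256

# Object queries come from an nn.Embedding table.
query_embed = nn.Embedding(num_queries, d_model)

# The depth cross-attention module (assumed to be a standard nn.MultiheadAttention).
cross_attn_depth = nn.MultiheadAttention(d_model, num_heads=8)

# Placeholder depth embeddings, e.g. from the depth predictor (shape assumed).
depth_embed = torch.randn(100, 1, d_model)

# Queries are taken straight from the embedding weights: (num_queries, 1, d_model).
queries = query_embed.weight.unsqueeze(1)

# The queries and depth embeddings are passed directly into the attention module;
# I do not see separate nn.Linear layers projecting them into Q, K, V beforehand.
out, _ = cross_attn_depth(query=queries, key=depth_embed, value=depth_embed)
```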