
Commit 87efc68

llama4_vision_rope: add HIP override to accept (q, k) and avoid (positions, q, k) mismatch (#26790)
Signed-off-by: Huamin Li <3ericli@gmail.com>
1 parent c3a722f commit 87efc68

File tree

1 file changed, +7 -0 lines changed


vllm/model_executor/layers/rotary_embedding/llama4_vision_rope.py

Lines changed: 7 additions & 0 deletions
@@ -78,3 +78,10 @@ def forward_cuda(  # type: ignore[override]
         key: torch.Tensor | None = None,
     ) -> tuple[torch.Tensor, torch.Tensor | None]:
         return self.forward_native(query, key)
+
+    def forward_hip(  # type: ignore[override]
+        self,
+        query: torch.Tensor,
+        key: torch.Tensor | None = None,
+    ) -> tuple[torch.Tensor, torch.Tensor | None]:
+        return self.forward_native(query, key)
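
Why the override matters: the Llama4 vision RoPE's forward_native/forward_cuda take (query, key), while the rotary-embedding base class it inherits from takes (positions, query, key). On ROCm the CustomOp-style dispatch routes calls to forward_hip, so without this override the inherited (positions, query, key) signature would mis-bind the two arguments the vision tower actually passes. The following is a minimal, self-contained sketch of that dispatch pattern; the class names and the torch.version.hip check are illustrative assumptions, not vLLM's actual implementation.

    # Minimal sketch (hypothetical classes, not vLLM code) of the platform
    # dispatch and the signature mismatch this commit avoids.
    import torch


    class BaseRope:
        """Parent RoPE: all forward_* paths take (positions, query, key)."""

        def forward_native(self, positions, query, key):
            return query, key  # rotation elided in this sketch

        def forward_cuda(self, positions, query, key):
            return self.forward_native(positions, query, key)

        def forward_hip(self, positions, query, key):
            return self.forward_native(positions, query, key)

        def forward(self, *args):
            # Hypothetical platform dispatch, mirroring the CustomOp idea:
            # ROCm builds go through forward_hip, everything else forward_cuda.
            if torch.version.hip is not None:
                return self.forward_hip(*args)
            return self.forward_cuda(*args)


    class Llama4VisionRopeSketch(BaseRope):
        """Vision RoPE: callers pass only (query, key), no positions."""

        def forward_native(self, query, key=None):  # type: ignore[override]
            return query, key

        def forward_cuda(self, query, key=None):  # type: ignore[override]
            return self.forward_native(query, key)

        # Without this override, a ROCm build falls through to
        # BaseRope.forward_hip, which expects (positions, query, key) and
        # mis-binds the two arguments the vision model actually passes.
        def forward_hip(self, query, key=None):  # type: ignore[override]
            return self.forward_native(query, key)


    if __name__ == "__main__":
        q = torch.randn(2, 4, 8)
        k = torch.randn(2, 4, 8)
        rope = Llama4VisionRopeSketch()
        out_q, out_k = rope.forward(q, k)  # same call works on both dispatch paths
        print(out_q.shape, out_k.shape)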
