Commit 44ef111

Remove empty line

1 parent 506f798 commit 44ef111

File tree

1 file changed: 0 additions, 1 deletion


vllm/attention/layer.py

Lines changed: 0 additions & 1 deletion
@@ -532,7 +532,6 @@ def unified_attention_with_output(
         # Not all layers can use RoPE fusing, so check that they were given all
         # needed inputs along with the environment variable to enable this.
         if (
-
             and hasattr(self.impl, "rotary_emb")
             and self.impl.rotary_emb is not None
             and positions is not None
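For context, the condition touched by this diff follows a common capability-guard pattern: take the fused-RoPE path only when the environment variable is set, the attention implementation actually exposes a rotary embedding, and the caller supplied token positions. A minimal sketch of that pattern, assuming hypothetical names (`Impl`, `can_fuse_rope`, and `env_enabled` are illustrative stand-ins, not vLLM's actual API):

```python
# Sketch of a capability-guard check like the one in the diff above.
# `Impl`, `can_fuse_rope`, and `env_enabled` are hypothetical, not vLLM APIs.

class Impl:
    def __init__(self, rotary_emb=None):
        # May be None when this implementation does not support RoPE fusing.
        self.rotary_emb = rotary_emb


def can_fuse_rope(impl, positions, env_enabled):
    # Fuse only when the feature flag is on, the impl exposes a rotary
    # embedding, and the caller actually provided token positions.
    return (
        env_enabled
        and hasattr(impl, "rotary_emb")
        and impl.rotary_emb is not None
        and positions is not None
    )


print(can_fuse_rope(Impl(rotary_emb=object()), positions=[0, 1], env_enabled=True))  # True
print(can_fuse_rope(Impl(), positions=[0, 1], env_enabled=True))                     # False
```

Collapsing the checks into a single boolean expression, as the real code does, keeps the fast path a one-line branch and makes it cheap to add or remove preconditions.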
