
Commit c77cefd (parent: 7e0ac3b)

fix ruff check error

1 file changed: +0 -2

src/transformers/integrations/npu_flash_attention.py

Lines changed: 0 additions & 2 deletions
@@ -23,7 +23,6 @@
 
 import torch_npu
 from einops import rearrange, repeat
-from torch_npu import npu_rotary_mul
 
 
 # FlashAttention2 is supported on Ascend NPU with down-right aligned causal mask by default.
@@ -251,4 +250,3 @@ def npu_flash_attn_varlen_func(
     )[0]
 
     return output
-
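
For reference, the removed line is an unused import, which ruff flags with its unused-import check. A minimal sketch of the pattern, assuming the diagnostic was F401 ("imported but unused"); the rule code and the file below are illustrative assumptions, not taken from the commit:

    # lint_demo.py -- hypothetical file reproducing the F401 pattern
    import os            # referenced below, so ruff keeps it
    from os import path  # never referenced: ruff reports F401 "imported but unused"

    def cwd_name() -> str:
        # Only `os` is exercised; `path` goes unused, just as
        # `npu_rotary_mul` went unused in npu_flash_attention.py.
        return os.getcwd()

Running `ruff check lint_demo.py` flags the second import; the fix is to delete it, which is what this commit does for `from torch_npu import npu_rotary_mul` (along with dropping a trailing blank line at the end of the file).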
