
Commit 6aeb1da

[Bugfix] Fix incorrect import of CacheConfig (#24631)
Signed-off-by: DarkLight1337 <tlleungac@connect.ust.hk>
1 parent e93f4cc

1 file changed

vllm/attention/layers/cross_attention.py

Lines changed: 1 addition & 2 deletions
@@ -6,14 +6,13 @@
 
 import numpy as np
 import torch
-from transformers import CacheConfig
 
 from vllm import envs
 from vllm.attention.backends.abstract import (AttentionBackend,
                                               AttentionMetadata, AttentionType)
 from vllm.attention.layer import Attention
 from vllm.attention.selector import get_attn_backend
-from vllm.config import VllmConfig
+from vllm.config import CacheConfig, VllmConfig
 from vllm.logger import init_logger
 from vllm.multimodal import MULTIMODAL_REGISTRY
 from vllm.utils import cdiv
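
Why this one-line change matters: transformers also exports a class named CacheConfig (for KV-cache quantization settings), so the incorrect import resolved without raising ImportError but bound an unrelated type. The fix takes CacheConfig from vllm.config, the same module that provides VllmConfig. Below is a minimal illustrative sketch, not part of the commit: the describe_cache helper is hypothetical, and it assumes vLLM's CacheConfig exposes a block_size field.

# Illustrative sketch (hypothetical helper, not from the commit).
# vLLM's CacheConfig and transformers' CacheConfig are unrelated classes,
# so the wrong import would fail only later, via type/attribute mismatches
# rather than at import time.
from vllm.config import CacheConfig  # correct source of CacheConfig


def describe_cache(cfg: CacheConfig) -> str:
    # block_size is assumed to be a field on vLLM's CacheConfig (see the
    # note above); transformers' CacheConfig has no such attribute.
    return f"KV cache block size: {cfg.block_size}"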
