
Commit eabca2e

DarkLight1337 authored and xuebwang-amd committed
[Bugfix] Fix incorrect import of CacheConfig (vllm-project#24631)
Signed-off-by: DarkLight1337 <tlleungac@connect.ust.hk>
Signed-off-by: xuebwang-amd <xuebwang@amd.com>
1 parent 3483a86 commit eabca2e


1 file changed (+1, −2)


vllm/attention/layers/cross_attention.py

Lines changed: 1 addition & 2 deletions
@@ -6,14 +6,13 @@
 
 import numpy as np
 import torch
-from transformers import CacheConfig
 
 from vllm import envs
 from vllm.attention.backends.abstract import (AttentionBackend,
                                               AttentionMetadata, AttentionType)
 from vllm.attention.layer import Attention
 from vllm.attention.selector import get_attn_backend
-from vllm.config import VllmConfig
+from vllm.config import CacheConfig, VllmConfig
 from vllm.logger import init_logger
 from vllm.multimodal import MULTIMODAL_REGISTRY
 from vllm.utils import cdiv
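For context, a minimal sketch (not part of the commit) of why the wrong import could go unnoticed: transformers exports a class of the same name, so `from transformers import CacheConfig` resolved without error while binding an unrelated type. The class below is hypothetical and only mirrors the annotation pattern; `vllm.config.CacheConfig` and `vllm.config.VllmConfig` are the real classes the fixed import refers to.

```python
from typing import Optional

# After the fix: vLLM's own KV-cache configuration class. Before the
# fix, the same name silently resolved to transformers' CacheConfig,
# an unrelated class, because both libraries export that name.
from vllm.config import CacheConfig


class CrossAttentionSketch:
    """Hypothetical layer; illustrates the annotation pattern only."""

    def __init__(self, cache_config: Optional[CacheConfig] = None) -> None:
        # With the old import, this annotation (and any isinstance()
        # check against it) referred to transformers' class rather than
        # vLLM's, even though callers pass vLLM's config object.
        self.cache_config = cache_config
```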