Commit 65197a5

[Misc] Modify CacheConfig import (#23459)
Signed-off-by: Jee Jee Li <pandaleefree@gmail.com>
1 parent b8f17f5 commit 65197a5

File tree

1 file changed (+1, -1 lines)

vllm/attention/layers/encoder_only_attention.py

Lines changed: 1 addition & 1 deletion
@@ -5,13 +5,13 @@
 from typing import Optional

 import torch
-from transformers import CacheConfig

 from vllm import envs
 from vllm.attention.backends.abstract import (AttentionBackend,
                                               AttentionMetadata, AttentionType)
 from vllm.attention.layer import Attention
 from vllm.attention.selector import get_attn_backend
+from vllm.config import CacheConfig
 from vllm.v1.attention.backends.utils import (CommonAttentionMetadata,
                                               subclass_attention_backend)
