
Conversation

@jeejeelee (Collaborator) commented Aug 23, 2025

Purpose

When using the latest main branch of transformers, the following import error is encountered:

ERROR 08-23 02:59:11 [registry.py:430]   File "/root/Code/vllm_dev/vllm/vllm/attention/layers/encoder_only_attention.py", line 8, in <module>
ERROR 08-23 02:59:11 [registry.py:430]     from transformers import CacheConfig
ERROR 08-23 02:59:11 [registry.py:430] ImportError: cannot import name 'CacheConfig' from 'transformers' (/root/Code/vllm_dev/transformers/src/transformers/__init__.py)

In addition, based on the parameters of the Attention layer, CacheConfig should be imported from vllm.config.
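
For reference, a minimal sketch of the change (assuming the import statement is the only line touched in vllm/attention/layers/encoder_only_attention.py):

```python
# Before: fails on recent transformers main, which no longer exports CacheConfig
# from transformers import CacheConfig

# After: CacheConfig here is vLLM's own config class, which matches the
# parameters expected by the Attention layer
from vllm.config import CacheConfig
```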

Test Plan

Test Result

(Optional) Documentation Update


Essential Elements of an Effective PR Description Checklist
  • The purpose of the PR, such as "Fix some issue (link existing issues this PR will resolve)".
  • The test plan, such as providing test command.
  • The test results, such as pasting the results comparison before and after, or e2e results
  • (Optional) The necessary documentation update, such as updating supported_models.md and examples for a new model.

Signed-off-by: Jee Jee Li <pandaleefree@gmail.com>
@jeejeelee jeejeelee requested a review from heheda12345 August 23, 2025 03:10
@gemini-code-assist bot (Contributor) left a comment

Code Review

This pull request correctly resolves an ImportError for CacheConfig in vllm/attention/layers/encoder_only_attention.py. The change updates the import path from transformers to vllm.config, which is the correct source for this class as used by the Attention superclass. The fix is accurate and necessary to resolve the described runtime error.
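
For completeness, a minimal sketch of how the fix could be verified locally; the setup is an assumption (a development checkout of vLLM with this change applied) and is not part of the original review:

```python
# Hypothetical local check: both imports should now succeed without the
# ImportError reported in the PR description.
from vllm.config import CacheConfig  # noqa: F401  # vLLM's own CacheConfig
import vllm.attention.layers.encoder_only_attention  # noqa: F401  # previously failing module

print("CacheConfig and encoder_only_attention import cleanly")
```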

@DarkLight1337 DarkLight1337 enabled auto-merge (squash) August 23, 2025 04:12
@github-actions github-actions bot added the ready ONLY add when PR is ready to merge/full CI is needed label Aug 23, 2025
@DarkLight1337 DarkLight1337 merged commit 65197a5 into vllm-project:main Aug 23, 2025
49 checks passed
@jeejeelee jeejeelee deleted the modify-cacheconfig-import branch August 23, 2025 12:19
@heheda12345 (Collaborator) commented

Sorry for that... But it is quite strange that the type checker failed to catch this problem.

epwalsh pushed a commit to epwalsh/vllm that referenced this pull request Aug 28, 2025
Signed-off-by: Jee Jee Li <pandaleefree@gmail.com>
xiao-llm pushed a commit to xiao-llm/vllm that referenced this pull request Aug 28, 2025
Signed-off-by: Jee Jee Li <pandaleefree@gmail.com>
Signed-off-by: Xiao Yu <xiao.yu@amd.com>
zhewenl pushed a commit to zhewenl/vllm that referenced this pull request Aug 28, 2025
Signed-off-by: Jee Jee Li <pandaleefree@gmail.com>
mengxingkongzhouhan pushed a commit to mengxingkongzhouhan/vllm that referenced this pull request Aug 30, 2025
Signed-off-by: Jee Jee Li <pandaleefree@gmail.com>
zhewenl pushed a commit to zhewenl/vllm that referenced this pull request Sep 3, 2025
Signed-off-by: Jee Jee Li <pandaleefree@gmail.com>
ekagra-ranjan pushed a commit to ekagra-ranjan/vllm that referenced this pull request Sep 4, 2025
Signed-off-by: Jee Jee Li <pandaleefree@gmail.com>
Signed-off-by: Ekagra Ranjan <3116519+ekagra-ranjan@users.noreply.github.com>
FeiDaLI pushed a commit to FeiDaLI/vllm that referenced this pull request Sep 25, 2025
Signed-off-by: Jee Jee Li <pandaleefree@gmail.com>