Conversation

@ahengljh (Contributor) commented Aug 26, 2025

Purpose

Add a deprecation warning for the `lora_extra_vocab_size` parameter, as suggested in PR #23540. This is a gentler approach than immediate removal, giving users time to adapt their code.

Related to: #23474

Test Plan

Test that the deprecation warning appears when using LoRA with non-zero lora_extra_vocab_size:

from vllm.config import LoRAConfig
config = LoRAConfig()                           # warning also fires with the default value
config = LoRAConfig(lora_extra_vocab_size=256)  # warning fires for an explicit non-zero value
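As a hedged sketch (not part of this PR), the check could also be automated with pytest's caplog fixture, assuming the warning goes through Python's standard logging module and propagates to the root logger; the test name is illustrative:

import logging

from vllm.config import LoRAConfig


def test_lora_extra_vocab_size_deprecation_warning(caplog):
    # caplog is pytest's built-in log-capture fixture; capture WARNING-level
    # records emitted while constructing the config.
    with caplog.at_level(logging.WARNING):
        LoRAConfig(lora_extra_vocab_size=256)
    assert "`lora_extra_vocab_size` is deprecated" in caplog.text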

Test Result

>>> from vllm.config import LoRAConfig
INFO 08-26 16:37:46 [__init__.py:241] Automatically detected platform cuda.
>>> config = LoRAConfig()
WARNING 08-26 16:38:01 [__init__.py:2502] `lora_extra_vocab_size` is deprecated and will be removed in a future release. Additional vocabulary support for LoRA adapters is being phased out.


@gemini-code-assist (bot) left a comment


Code Review

This pull request adds a deprecation warning for the lora_extra_vocab_size parameter in LoRAConfig. While this is a good step towards phasing out the feature, I've found a critical issue where the existing validation logic contradicts the deprecation. Users are unable to set lora_extra_vocab_size to 0 to disable the feature without causing a ValueError. My review includes a detailed comment on how to address this to ensure a smooth deprecation path for users.
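For illustration only, one way the check could accept 0 as an explicit opt-out during the deprecation window; the function name and the allowed sizes here are assumptions, not the actual vllm/config.py validation:

def _validate_lora_extra_vocab_size(value: int) -> None:
    # Hypothetical sketch: 0 disables the feature, so users can migrate
    # ahead of removal without triggering a ValueError.
    allowed = (0, 256, 512)
    if value not in allowed:
        raise ValueError(
            f"lora_extra_vocab_size ({value}) must be one of {allowed}.")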

@jeejeelee (Collaborator) commented

Please fix the pre-commit failure

@ahengljh ahengljh force-pushed the deprecate-lora-extra-vocab branch from 7b20c21 to 5aa1e64 Compare August 27, 2025 02:49
@ahengljh (Contributor, Author) commented Aug 27, 2025

Please fix the pre-commit failure

Done, it passes now. But the other pipelines seem to have failed because the GPU was occupied; maybe they should be re-triggered.

@hmellor (Member) commented Aug 27, 2025

fastcheck is not necessary to merge a PR; I'll enable the main tests to see if this broke anything.

@hmellor hmellor added the ready ONLY add when PR is ready to merge/full CI is needed label Aug 27, 2025
@hmellor (Member) left a comment


Looks like a bunch of LoRA tests are now failing. Could you fix the tests to work with the new default?

Signed-off-by: Jinheng Li <ahengljh@gmail.com>
Signed-off-by: Jinheng Li <ahengljh@gmail.com>
@ahengljh ahengljh force-pushed the deprecate-lora-extra-vocab branch from df6dda6 to baf7e52 Compare August 28, 2025 02:29
@ahengljh (Contributor, Author) commented

Looks like a bunch of LoRA tests are now failing. Could you fix the tests to work with the new default?

I believe it should be OK now; I removed some changes and kept only the warning message.

@ahengljh (Contributor, Author) commented

Somehow my local pre-commit didn't catch the format error that the pipeline did. I will fix it manually. @jeejeelee @hmellor

The previous commit changed the default value from 256 to 0, which
caused test_v1_llm_by_default to fail. This commit keeps the original
default value of 256 while still adding the deprecation warning to
notify users that this feature will be removed in v0.12.0.

The deprecation warning will always be shown when LoRAConfig is created,
alerting users to the upcoming removal without breaking existing code.

Signed-off-by: Jinheng Li <ahengljh@gmail.com>
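
The commit message above describes the final approach. A minimal sketch of how it might look, assuming the warning is emitted from LoRAConfig.__post_init__ via a module-level logger; the real class in vllm/config.py has many more fields:

import logging
from dataclasses import dataclass

logger = logging.getLogger(__name__)


@dataclass
class LoRAConfig:
    # Default kept at 256 so existing behavior (and tests such as
    # test_v1_llm_by_default) are unchanged.
    lora_extra_vocab_size: int = 256

    def __post_init__(self) -> None:
        # Warn unconditionally: the parameter itself is slated for removal,
        # regardless of the value it is given.
        logger.warning(
            "`lora_extra_vocab_size` is deprecated and will be removed "
            "in a future release. Additional vocabulary support for LoRA "
            "adapters is being phased out.")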
@ahengljh ahengljh force-pushed the deprecate-lora-extra-vocab branch from baf7e52 to 2699b51 Compare August 28, 2025 02:57
@jeejeelee jeejeelee enabled auto-merge (squash) August 28, 2025 04:28
@WoosukKwon WoosukKwon disabled auto-merge August 28, 2025 05:34
@WoosukKwon WoosukKwon merged commit c8851a4 into vllm-project:main Aug 28, 2025
41 checks passed
zhewenl pushed a commit to zhewenl/vllm that referenced this pull request Aug 28, 2025
zhewenl pushed a commit to zhewenl/vllm that referenced this pull request Sep 3, 2025
eicherseiji pushed a commit to eicherseiji/vllm that referenced this pull request Sep 9, 2025
FeiDaLI pushed a commit to FeiDaLI/vllm that referenced this pull request Sep 25, 2025