
Conversation

@shen-shanshan (Collaborator)

What this PR does / why we need it?

Run:

pytest -sv tests/singlecard/test_guided_decoding.py

This fails with the following error:

FAILED tests/singlecard/test_guided_decoding.py::test_guided_json_completion[guidance:disable-any-whitespace] - NotImplementedError: VLLM_USE_V1=1 is not supported with --guided-decoding-backend=guidance:disable-any-whitespace.
FAILED tests/singlecard/test_guided_decoding.py::test_guided_regex[guidance:disable-any-whitespace] - NotImplementedError: VLLM_USE_V1=1 is not supported with --guided-decoding-backend=guidance:disable-any-whitespace.
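
For reference, the fix swaps the unsupported backend string for plain guidance in the test parametrization, as the review below also notes. A minimal sketch of that kind of change, assuming the tests parametrize a backend list (the constant name, the test body, and the other entries are illustrative, not the exact file contents):

import pytest

# Hypothetical parametrization; only the "guidance" entry reflects the fix.
GUIDED_DECODING_BACKENDS = [
    "outlines",
    # "guidance:disable-any-whitespace",  # raises NotImplementedError with VLLM_USE_V1=1
    "guidance",  # plain backend name is accepted by the V1 engine
]

@pytest.mark.parametrize("guided_decoding_backend", GUIDED_DECODING_BACKENDS)
def test_guided_regex(guided_decoding_backend):
    ...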

Does this PR introduce any user-facing change?

How was this patch tested?

Run:

pytest -sv tests/singlecard/test_guided_decoding.py

Output:

=========================================================================== test session starts ===========================================================================
platform linux -- Python 3.10.18, pytest-8.4.1, pluggy-1.6.0
rootdir: /home/sss/github/vllm-v0.9.1/vllm-ascend
configfile: pytest.ini
plugins: anyio-4.10.0, mock-3.14.1
collected 8 items                                                                                                                                                         

tests/singlecard/test_guided_decoding.py ss.ss..s                                                                                                                   [100%]

============================================================================ warnings summary =============================================================================
<frozen importlib._bootstrap>:241
  <frozen importlib._bootstrap>:241: DeprecationWarning: builtin type SwigPyPacked has no __module__ attribute

<frozen importlib._bootstrap>:241
  <frozen importlib._bootstrap>:241: DeprecationWarning: builtin type SwigPyObject has no __module__ attribute

../../../miniconda3/envs/vllm-v0.9.1/lib/python3.10/site-packages/torch_npu/dynamo/torchair/__init__.py:8
  /home/sss/miniconda3/envs/vllm-v0.9.1/lib/python3.10/site-packages/torch_npu/dynamo/torchair/__init__.py:8: DeprecationWarning: pkg_resources is deprecated as an API. See https://setuptools.pypa.io/en/latest/pkg_resources.html
    import pkg_resources

-- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html
========================================================== 3 passed, 5 skipped, 3 warnings in 193.21s (0:03:13) ===========================================================

Signed-off-by: shen-shanshan <467638484@qq.com>
@gemini-code-assist bot (Contributor) left a comment


Code Review

This pull request addresses a bug in the guided decoding tests where an invalid backend, guidance:disable-any-whitespace, was causing failures when running with VLLM_USE_V1=1. The fix correctly replaces the invalid backend with guidance, which resolves the NotImplementedError and allows the tests to pass as expected. The change is minimal, targeted, and effectively resolves the issue.

@Yikun (Collaborator) commented Aug 30, 2025

So is this just a test fix? Why was it failing in the historical CI?

@shen-shanshan (Collaborator, Author) commented Aug 30, 2025

So is this just a test fix? Why was it failing in the historical CI?

I am also confused about this. 😂 How did the CI pass in the historical v0.9.1 PRs?

Signed-off-by: shen-shanshan <467638484@qq.com>
Signed-off-by: shen-shanshan <467638484@qq.com>
f"{guided_decoding_backend} will fall back to outlines, skip it")
if guided_decoding_backend == "outlines":
pytest.skip(
f"{guided_decoding_backend} will take up too much time for json "
Collaborator


How much time does this case cost? If it is reasonable, I think we'd better keep it running, as the CI load on v0.9.1-dev is actually not too heavy.

Collaborator Author


maybe 10+ minutes...

Collaborator


Okay, then let's keep it skipped

@wangxiyuan merged commit 234a5a4 into vllm-project:v0.9.1-dev on Sep 1, 2025
8 checks passed
