
Conversation

@russellb (Member)

Our previous behavior with xgrammar was to ensure that the response was
valid JSON. When the json_object output format is requested, the correct
behavior is to ensure the response is a JSON object. Valid JSON could
also be an array, a string, or a number, which is not desired here.

The fix is to explicitly use a json schema of {"type": "object"}. An
upgrade to xgrammar is necessary to fix a bug with xgrammar's support of
this schema, so we upgrade to 0.1.17.

Signed-off-by: Russell Bryant rbryant@redhat.com
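
To illustrate the distinction described above (a minimal Python sketch, not part of the PR's diff): the standard library happily parses any valid JSON value, so "valid JSON" and "JSON object" are not the same thing.

```python
import json

# Every entry below is valid JSON, but only the last one is a JSON object.
candidates = ['42', '"hello"', '[1, 2, 3]', 'null', '{"name": "vllm"}']

for text in candidates:
    value = json.loads(text)             # parses successfully for all of them
    is_object = isinstance(value, dict)  # True only for {...}
    print(f"{text!r:20} -> object={is_object}")
```

The schema `{"type": "object"}` closes exactly that gap: the grammar only accepts completions that parse to a JSON object.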

@russellb russellb requested a review from mgoin as a code owner March 25, 2025 17:59
@github-actions

👋 Hi! Thank you for contributing to the vLLM project.

💬 Join our developer Slack at https://slack.vllm.ai to discuss your PR in #pr-reviews, coordinate on features in #feat- channels, or join special interest groups in #sig- channels.

Just a reminder: PRs do not trigger a full CI run by default. Instead, they only run the fastcheck CI, which covers a small, essential subset of CI tests to catch errors quickly. You can run other CI tests on top of those by going to your fastcheck build on the Buildkite UI (linked in the PR checks section) and unblocking them. If you do not have permission to unblock, ping simon-mo or khluu to add you to our Buildkite org.

Once the PR is approved and ready to go, your PR reviewer(s) can run CI to test the changes comprehensively before merging.

To run CI, PR reviewers can either add the ready label to the PR or enable auto-merge.

🚀

@aarnphm aarnphm (Collaborator) left a comment

This overall looks good to me; I just have one follow-up question about an option for arbitrary JSON schemas.

Collaborator

hmm, should we also have an option to compile_builtin_json_schema?

Member Author

Maybe. You could also have a json schema of {} to get the same behavior. I'm not sure if there's a performance difference between the two.

From a user perspective though, it's still possible to get "give me any valid json", but I seriously doubt most people actually want that.
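
For illustration, here is a rough sketch of the options being discussed, assuming the xgrammar 0.1.x `GrammarCompiler` API and an arbitrary HuggingFace tokenizer (the model name is just a placeholder; this is not code from the PR):

```python
# Sketch only: assumes xgrammar ~0.1.17 and transformers are installed.
import xgrammar as xgr
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("Qwen/Qwen2.5-1.5B-Instruct")  # placeholder model
tokenizer_info = xgr.TokenizerInfo.from_huggingface(tokenizer)
compiler = xgr.GrammarCompiler(tokenizer_info)

# Built-in JSON grammar: accepts any valid JSON value (object, array, string, number, ...).
any_json = compiler.compile_builtin_json_grammar()

# An empty schema places no constraints, so it should accept any JSON value too.
also_any_json = compiler.compile_json_schema("{}")

# The schema used by this PR: accepts only a JSON object.
object_only = compiler.compile_json_schema('{"type": "object"}')
```

Whether the built-in grammar and the empty schema differ in compile time or mask-generation speed is the open question raised here; functionally, both allow any JSON value.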

Collaborator

hmm, I always thought that compile_builtin_json_schema just compiles the default JSON schema spec, and then users would have a prompt saying something like:

`Generate me a user for a game that has default 'strength', 'power', and 'abilities'`

Collaborator

But you are correct, it probably makes it easier this way, then.

@mergify mergify bot added the `tpu` label (Related to Google TPUs) Mar 27, 2025
@russellb russellb force-pushed the xgrammar-json-object-fix branch from 1a3c69b to e46b6eb on March 27, 2025 17:55
@mergify mergify bot removed the `tpu` label (Related to Google TPUs) Mar 28, 2025
@russellb russellb added the `ready` label (ONLY add when PR is ready to merge/full CI is needed) Mar 28, 2025
@russellb russellb requested a review from Copilot March 28, 2025 15:08
Copilot AI (Contributor) left a comment

Pull Request Overview

This PR fixes the handling of the json_object output format when using xgrammar by explicitly compiling a JSON schema with {"type": "object"}. It ensures that only a valid JSON object is produced rather than other JSON types.

  • Updated backend_xgrammar.py to compile the JSON schema instead of using the built-in JSON grammar.
  • Modified xgrammar_decoding.py to use the JSON schema for json_object with proper whitespace handling.
  • Removed the fallback error for json_object in guided_decoding/__init__.py.
  • Updated tests to assert that the generated output is a JSON object.

Reviewed Changes

Copilot reviewed 4 out of 5 changed files in this pull request and generated no comments.

| File | Description |
| --- | --- |
| vllm/v1/structured_output/backend_xgrammar.py | Replaced `compile_builtin_json_grammar` with `compile_json_schema` using `'{"type": "object"}'`. |
| vllm/model_executor/guided_decoding/xgrammar_decoding.py | Updated the `json_object` branch to compile the proper JSON schema with correct whitespace handling. |
| vllm/model_executor/guided_decoding/__init__.py | Removed the error fallback for `json_object` to support the new behavior. |
| tests/v1/entrypoints/llm/test_struct_output_generate.py | Changed the test assertion to verify the generated JSON is an object. |

Files not reviewed (1)
  • requirements/common.txt: Language not supported
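
For context on what the updated test guarantees from a client's point of view, here is a hedged end-to-end sketch (not the PR's test code) against a locally running vLLM OpenAI-compatible server; the base URL and model name are placeholders:

```python
import json

from openai import OpenAI

# Placeholder endpoint and model; point these at your own vLLM server.
client = OpenAI(base_url="http://localhost:8000/v1", api_key="EMPTY")

completion = client.chat.completions.create(
    model="Qwen/Qwen2.5-1.5B-Instruct",
    messages=[{"role": "user", "content": "Describe a game character as JSON."}],
    response_format={"type": "json_object"},
)

parsed = json.loads(completion.choices[0].message.content)
# With this change, constrained decoding produces a JSON object,
# not a bare array, string, or number.
assert isinstance(parsed, dict)
print(parsed)
```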

@mergify

mergify bot commented Mar 28, 2025

This pull request has merge conflicts that must be resolved before it can be
merged. Please rebase the PR, @russellb.

https://docs.github.com/en/pull-requests/collaborating-with-pull-requests/working-with-forks/syncing-a-fork

@mergify mergify bot added the needs-rebase label Mar 28, 2025
Related xgrammar issues:

- mlc-ai/xgrammar#256
- mlc-ai/xgrammar#264
@russellb russellb force-pushed the xgrammar-json-object-fix branch from e46b6eb to fca77a5 on March 28, 2025 17:56
@mergify mergify bot removed the needs-rebase label Mar 28, 2025
@DarkLight1337 DarkLight1337 (Member) left a comment

Stamp

@vllm-bot vllm-bot merged commit 14e53ed into vllm-project:main Apr 2, 2025
57 of 59 checks passed
Alex4210987 pushed a commit to LeiWang1999/vllm-bitblas that referenced this pull request Apr 5, 2025
Signed-off-by: Russell Bryant <rbryant@redhat.com>
Signed-off-by: xinyuxiao <xinyuxiao2024@gmail.com>
lulmer pushed a commit to lulmer/vllm that referenced this pull request Apr 7, 2025
Signed-off-by: Russell Bryant <rbryant@redhat.com>
Signed-off-by: Louis Ulmer <ulmerlouis@gmail.com>
lk-chen pushed a commit to lk-chen/vllm that referenced this pull request Apr 29, 2025
Signed-off-by: Russell Bryant <rbryant@redhat.com>
shreyankg pushed a commit to shreyankg/vllm that referenced this pull request May 3, 2025
Signed-off-by: Russell Bryant <rbryant@redhat.com>
RichardoMrMu pushed a commit to RichardoMrMu/vllm that referenced this pull request May 12, 2025
Signed-off-by: Russell Bryant <rbryant@redhat.com>
Signed-off-by: Mu Huai <tianbowen.tbw@antgroup.com>

Labels

ci/build · ready (ONLY add when PR is ready to merge/full CI is needed) · structured-output · v1

4 participants