
Conversation

@wangxiyuan
Contributor

additional_config is carried inside vllm_config and can be used by custom platforms. It does not affect any V1 feature or function, and it works correctly with V1. This PR drops the related check in the V1 engine.

Signed-off-by: wangxiyuan <wangxiyuan1007@gmail.com>
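For context, the intended flow is roughly: the user passes an opaque dict, vLLM core carries it in `vllm_config.additional_config` without interpreting it, and an out-of-tree platform plugin reads it during engine setup. A minimal sketch of the platform side, assuming a hook shaped like `Platform.check_and_update_config()`; the config key is a made-up example, not a real option:

```python
# Sketch only: how a custom platform might consume additional_config.
# The hook shape mirrors Platform.check_and_update_config(); the key
# "enable_custom_graph_mode" is hypothetical.
from vllm.config import VllmConfig


def check_and_update_config(vllm_config: VllmConfig) -> None:
    """Called by a hypothetical out-of-tree platform during engine setup."""
    opts = vllm_config.additional_config or {}
    if opts.get("enable_custom_graph_mode"):
        # Turn on a backend-specific optimization. vLLM core and the V1
        # engine never inspect this dict, so no extra check is needed there.
        ...
```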
@github-actions

👋 Hi! Thank you for contributing to the vLLM project.

💬 Join our developer Slack at https://slack.vllm.ai to discuss your PR in #pr-reviews, coordinate on features in #feat- channels, or join special interest groups in #sig- channels.

Just a reminder: PRs do not trigger a full CI run by default. Instead, only the fastcheck CI runs, covering a small, essential subset of CI tests to quickly catch errors. You can run additional CI tests on top of those by going to your fastcheck build in the Buildkite UI (linked in the PR checks section) and unblocking them. If you do not have permission to unblock, ping simon-mo or khluu to add you to our Buildkite org.

Once the PR is approved and ready to go, your PR reviewer(s) can run CI to test the changes comprehensively before merging.

To run CI, PR reviewers can either add the ready label to the PR or enable auto-merge.

🚀

@russellb
Member

Hey @robertgshaw2-redhat, I assume this check was intentional. Can you take a look?

@wangxiyuan
Contributor Author

@robertgshaw2-redhat Hi, may I ask for your review? We tested the V1 engine with additional_config and it works well. Is there anything I missed?
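The smoke test was roughly of the following shape (simplified sketch; the model name and config key are placeholders, and the exact kwarg/flag for passing additional_config may differ by vLLM version):

```python
# Rough sketch of a V1 smoke test with additional_config passed through.
import os
os.environ["VLLM_USE_V1"] = "1"  # force the V1 engine

from vllm import LLM, SamplingParams

llm = LLM(
    model="facebook/opt-125m",
    additional_config={"enable_custom_graph_mode": 1},  # hypothetical key
)
out = llm.generate(["Hello, my name is"], SamplingParams(max_tokens=16))
print(out[0].outputs[0].text)
```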

@robertgshaw2-redhat
Collaborator

Fine by me. I was not sure what this was for, so I disabled it out of an abundance of caution.

@robertgshaw2-redhat robertgshaw2-redhat enabled auto-merge (squash) April 21, 2025 02:44
@github-actions github-actions bot added the ready ONLY add when PR is ready to merge/full CI is needed label Apr 21, 2025
@robertgshaw2-redhat
Collaborator

Thanks for the PR!

@wangxiyuan
Contributor Author

The test failure is a LoRA error; it doesn't look related to this PR.

@vllm-bot vllm-bot merged commit b9b4746 into vllm-project:main Apr 22, 2025
63 of 65 checks passed
frieda-huang pushed a commit to frieda-huang/vllm that referenced this pull request Apr 23, 2025
Signed-off-by: wangxiyuan <wangxiyuan1007@gmail.com>
Signed-off-by: Frieda (Jingying) Huang <jingyingfhuang@gmail.com>
jikunshang pushed a commit to jikunshang/vllm that referenced this pull request Apr 29, 2025
Signed-off-by: wangxiyuan <wangxiyuan1007@gmail.com>
lk-chen pushed a commit to lk-chen/vllm that referenced this pull request Apr 29, 2025
Signed-off-by: wangxiyuan <wangxiyuan1007@gmail.com>
adobrzyn pushed a commit to HabanaAI/vllm-fork that referenced this pull request Apr 30, 2025
Signed-off-by: wangxiyuan <wangxiyuan1007@gmail.com>
Signed-off-by: Agata Dobrzyniewicz <adobrzyniewicz@habana.ai>
RichardoMrMu pushed a commit to RichardoMrMu/vllm that referenced this pull request May 12, 2025
Signed-off-by: wangxiyuan <wangxiyuan1007@gmail.com>
Signed-off-by: Mu Huai <tianbowen.tbw@antgroup.com>