@Potabk (Collaborator) commented May 12, 2025

What this PR does / why we need it?

Now that #17962 has merged, the vLLM OpenAPI server launches normally on Python 3.10, so we re-enable the related tests.

Does this PR introduce any user-facing change?

How was this patch tested?

CI passed

Signed-off-by: wangli <wangli858794774@gmail.com>
@Yikun (Collaborator) left a comment


Thanks for the recovery, LGTM if CI passes

@wangxiyuan wangxiyuan merged commit 4df1e99 into vllm-project:main May 12, 2025
8 checks passed
@wangxiyuan (Collaborator) commented:

Thanks

@Potabk Potabk deleted the bugfix branch May 12, 2025 08:08
chopper0126 pushed a commit to chopper0126/vllm-ascend that referenced this pull request Oct 16, 2025
Angazenn pushed a commit to Angazenn/vllm-ascend that referenced this pull request Oct 21, 2025


3 participants