Conversation

@BoyuanFeng (Contributor) commented Jun 13, 2025

#18846 turns standalone_compile on by default via is_torch_equal_or_newer("2.8.0"). One major motivation is to use standalone_compile for the vLLM x torch nightly CI tests.

However, it has no effect yet: the current torch nightly reports 2.8.0a0+..., which is a pre-release and therefore compares as less than 2.8.0. This PR fixes the issue by comparing against 2.8.0a instead.
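For reference, a minimal sketch of the ordering under PEP 440 using the packaging library; the assumption here is that is_torch_equal_or_newer behaves like this comparison:

```python
from packaging.version import Version

nightly = Version("2.8.0a0+git093fd47")  # current torch nightly build

# Pre-releases sort before the final release, so the old check fails:
assert nightly < Version("2.8.0")    # is_torch_equal_or_newer("2.8.0") -> False

# "2.8.0a" normalizes to the first alpha (2.8.0a0), which nightlies satisfy:
assert nightly >= Version("2.8.0a")  # is_torch_equal_or_newer("2.8.0a") -> True
```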

Signed-off-by: Boyuan Feng <boyuan@meta.com>
@BoyuanFeng force-pushed the bf/torch-base-version branch from ec866e7 to 7a4723c on June 13, 2025 02:12
@zou3519 added the ready label (ONLY add when PR is ready to merge/full CI is needed) on Jun 13, 2025
@zou3519 self-requested a review on June 13, 2025 02:18
@zou3519 (Collaborator) commented Jun 13, 2025

cc @drisspg @jerryzh168: it turns out standalone_compile was not actually enabled by default for nightlies yet. The flag VLLM_USE_STANDALONE_COMPILE=1 still works for testing.
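A hedged sketch of exercising this flag when testing against a nightly; whether vllm.envs reads the variable at import time or lazily is an assumption, so setting it before the import is the safe order:

```python
import os

# Set the flag before vLLM is imported, in case vllm.envs snapshots the
# environment early. Per the version gate in the diff below, standalone
# compile then kicks in only when torch also satisfies
# is_torch_equal_or_newer("2.8.0a").
os.environ["VLLM_USE_STANDALONE_COMPILE"] = "1"

import vllm  # noqa: F401
```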

 if compilation_config.use_inductor:
     if envs.VLLM_USE_STANDALONE_COMPILE and is_torch_equal_or_newer(
-            "2.8.0"):
+            "2.8.0a"):
@zou3519 (Collaborator) commented:

Sorry, what does "a" in "2.8.0a" mean here?

@BoyuanFeng (Contributor, Author) replied:

'a' is the standard marker for an alpha pre-release under PEP 440: https://peps.python.org/pep-0440/
The current PyTorch nightly has the version 2.8.0a0+git093fd47.

@zou3519 (Collaborator) replied:

I see, thanks for the explanation. So the version parser thinks 2.8.0 > 2.8.0a then.
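A quick check with the packaging parser (assuming it matches what vLLM uses under the hood) confirms the ordering and the normalization:

```python
from packaging.version import Version

print(Version("2.8.0") > Version("2.8.0a"))     # True: final > pre-release
print(Version("2.8.0a") == Version("2.8.0a0"))  # True: PEP 440 implies a0
```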

@houseroad merged commit ce688ad into vllm-project:main on Jun 13, 2025 with 89 checks passed.