
Conversation

@rasmith (Contributor) commented Apr 26, 2025

This PR removes an unnecessary FP8 check from arg_utils.py. It turns out that when running FP8 on V1, a different attention backend is selected, so the engine never hits the Triton backend even if VLLM_USE_TRITON_FLASH_ATTENTION=1 is set.
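
For context, here is a minimal, hypothetical sketch of the kind of guard being removed. The actual code in arg_utils.py and its exact names differ; this only illustrates why such a check is redundant when V1 routes FP8 to a non-Triton attention backend.

```python
import os

# Hypothetical sketch of the removed guard (names and structure are
# illustrative, not the exact vLLM code in arg_utils.py).
def _fp8_config_is_supported(kv_cache_dtype: str, use_v1: bool) -> bool:
    """Return True if the config passes the (now-removed) FP8/V1 guard.

    The guard rejected FP8 KV-cache on V1 whenever Triton flash attention
    was requested via the environment, even though the V1 engine selects a
    different attention backend for FP8 and never reaches the Triton path.
    """
    use_triton_fa = os.environ.get("VLLM_USE_TRITON_FLASH_ATTENTION", "0") == "1"
    if use_v1 and kv_cache_dtype.startswith("fp8") and use_triton_fa:
        # Unnecessary rejection: FP8 on V1 bypasses the Triton backend anyway.
        return False
    return True


if __name__ == "__main__":
    # With the guard removed, this configuration is simply accepted; V1
    # picks its own FP8-capable attention backend at runtime.
    print(_fp8_config_is_supported("fp8", use_v1=True))
```

In other words, setting VLLM_USE_TRITON_FLASH_ATTENTION=1 alongside an FP8 KV-cache on V1 should no longer cause a spurious rejection.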

Signed-off-by: Randall Smith <Randall.Smith@amd.com>
@github-actions

👋 Hi! Thank you for contributing to the vLLM project.

💬 Join our developer Slack at https://slack.vllm.ai to discuss your PR in #pr-reviews, coordinate on features in #feat- channels, or join special interest groups in #sig- channels.

Just a reminder: PRs do not trigger a full CI run by default. Instead, only the fastcheck CI runs, covering a small and essential subset of tests to quickly catch errors. You can run additional CI tests on top of those by going to your fastcheck build in the Buildkite UI (linked in the PR checks section) and unblocking them. If you do not have permission to unblock, ping simon-mo or khluu to add you to our Buildkite org.

Once the PR is approved and ready to go, your PR reviewer(s) can run CI to test the changes comprehensively before merging.

To run full CI, PR reviewers can either add the ready label to the PR or enable auto-merge.

🚀

@rasmith rasmith changed the title [AMD][FP8][BugFix] Remove V1 check in arg_utils.py for fp8 since it is not necessary [AMD][FP8][BugFix] Remove V1 check in arg_utils.py for FP8 since it is not necessary Apr 26, 2025
@tlrmchlsmth tlrmchlsmth added the ready ONLY add when PR is ready to merge/full CI is needed label Apr 26, 2025
@tlrmchlsmth tlrmchlsmth enabled auto-merge (squash) April 26, 2025 01:22
@vllm-bot vllm-bot merged commit 68af5f6 into vllm-project:main Apr 26, 2025
34 of 38 checks passed
jikunshang pushed a commit to jikunshang/vllm that referenced this pull request Apr 29, 2025
…s not necessary (vllm-project#17215)

Signed-off-by: Randall Smith <Randall.Smith@amd.com>
lk-chen pushed a commit to lk-chen/vllm that referenced this pull request Apr 29, 2025
…s not necessary (vllm-project#17215)

Signed-off-by: Randall Smith <Randall.Smith@amd.com>
adobrzyn pushed a commit to HabanaAI/vllm-fork that referenced this pull request Apr 30, 2025
…s not necessary (vllm-project#17215)

Signed-off-by: Randall Smith <Randall.Smith@amd.com>
Signed-off-by: Agata Dobrzyniewicz <adobrzyniewicz@habana.ai>
RichardoMrMu pushed a commit to RichardoMrMu/vllm that referenced this pull request May 12, 2025
…s not necessary (vllm-project#17215)

Signed-off-by: Randall Smith <Randall.Smith@amd.com>
Signed-off-by: Mu Huai <tianbowen.tbw@antgroup.com>
zzzyq pushed a commit to zzzyq/vllm that referenced this pull request May 24, 2025
…s not necessary (vllm-project#17215)

Signed-off-by: Randall Smith <Randall.Smith@amd.com>
Signed-off-by: Yuqi Zhang <yuqizhang@google.com>