
Conversation

@Isotr0py Isotr0py commented Sep 6, 2025

Purpose

Also cc @minosfuture: can you check whether this PR also fixes the issue on your side? Thanks!

Test Plan

vllm serve Isotr0py/DeepSeek-V3-0324-tiny -tp 2 --enforce-eager --trust-remote-code

Test Result

Verified on a dummy tiny DeepSeek-V3 FP8 checkpoint with one MLA layer.


Essential Elements of an Effective PR Description Checklist
  • The purpose of the PR, such as "Fix some issue (link existing issues this PR will resolve)".
  • The test plan, such as providing the test command.
  • The test results, such as pasting a before/after comparison or end-to-end results.
  • (Optional) The necessary documentation update, such as updating supported_models.md and examples for a new model.
  • (Optional) Release notes update. If your change is user facing, please update the release notes draft in the Google Doc.

Signed-off-by: Isotr0py <mozf@mail2.sysu.edu.cn>
@DarkLight1337 DarkLight1337 enabled auto-merge (squash) September 6, 2025 12:58
@DarkLight1337 DarkLight1337 added the ready (ONLY add when PR is ready to merge/full CI is needed) label Sep 6, 2025

@gemini-code-assist gemini-code-assist bot left a comment

Code Review

This pull request aims to fix an issue with loading DeepSeek FP8 weights under tensor parallelism. The changes rename __post_init__ to update_param_tp_status in LinearBase and call it explicitly in ColumnParallelLinear and RowParallelLinear, which is the correct approach, since nn.Module never invokes __post_init__ automatically (that hook is a dataclasses feature). Additionally, the logic for retrieving tp_size in Fp8LinearMethod is updated to respect layer-specific tensor-parallelism settings.
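For orientation, here is a minimal runnable sketch of that pattern. The class and method names mirror the review above, but the bodies, signatures, and the tp_rank attribute handling are illustrative assumptions, not vLLM's actual implementation:

```python
import torch
import torch.nn as nn


class LinearBase(nn.Module):
    # nn.Module never invokes __post_init__ (that hook belongs to
    # dataclasses), so subclasses call this explicitly from __init__.
    def update_param_tp_status(self) -> None:
        # Propagate the layer's TP metadata onto its parameters so the
        # weight loader shards checkpoints against the right rank.
        for param in self.parameters():
            if hasattr(param, "tp_rank"):
                param.tp_rank = self.tp_rank


class ColumnParallelLinear(LinearBase):
    def __init__(self, in_features: int, out_features: int, tp_rank: int = 0):
        super().__init__()
        self.tp_rank = tp_rank
        weight = nn.Parameter(torch.empty(out_features, in_features))
        weight.tp_rank = -1  # stale placeholder, corrected by the call below
        self.weight = weight
        # Explicit call replaces the never-invoked __post_init__.
        self.update_param_tp_status()


layer = ColumnParallelLinear(16, 16, tp_rank=1)
assert layer.weight.tp_rank == 1
```

The Fp8LinearMethod change follows the same spirit: consult the layer's own tp_size (e.g. when a layer opts out of tensor parallelism) rather than always using the global tensor-parallel world size.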

My main feedback concerns a potential omission in update_param_tp_status. While the PR description mentions updating both tp_size and tp_rank, the implementation appears to update only tp_rank. This could cause problems if the parameter-loading logic relies on tp_size. I've added a specific comment with a suggestion to address this.
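Concretely, the suggestion amounts to something like the following (an illustrative sketch using the same hypothetical names as above, not the merged code):

```python
def update_param_tp_status(self) -> None:
    for param in self.parameters():
        if hasattr(param, "tp_rank"):
            param.tp_rank = self.tp_rank
        # The PR description promises tp_size as well; without this,
        # loaders that consult param.tp_size would see a stale value.
        if hasattr(param, "tp_size"):
            param.tp_size = self.tp_size
```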

@vllm-bot vllm-bot merged commit 00a4e56 into vllm-project:main Sep 6, 2025
47 of 51 checks passed
@Isotr0py Isotr0py deleted the fix-fp8-disable-tp branch September 6, 2025 16:30
eicherseiji pushed a commit to eicherseiji/vllm that referenced this pull request Sep 9, 2025
Signed-off-by: Isotr0py <mozf@mail2.sysu.edu.cn>
skyloevil pushed a commit to skyloevil/vllm that referenced this pull request Sep 13, 2025
Signed-off-by: Isotr0py <mozf@mail2.sysu.edu.cn>
FeiDaLI pushed a commit to FeiDaLI/vllm that referenced this pull request Sep 25, 2025
Signed-off-by: Isotr0py <mozf@mail2.sysu.edu.cn>
xuebwang-amd pushed a commit to xuebwang-amd/vllm that referenced this pull request Oct 10, 2025
Signed-off-by: Isotr0py <mozf@mail2.sysu.edu.cn>
Signed-off-by: xuebwang-amd <xuebwang@amd.com>

Labels

deepseek (Related to DeepSeek models), ready (ONLY add when PR is ready to merge/full CI is needed)
