Conversation

@ganyi1996ppo (Collaborator) commented Jun 17, 2025

What this PR does / why we need it?

This PR is a cherry-pick of PR #1229, which has already been merged into the main branch.

This PR resolves [issue 1147](vllm-project#1147):

  1. Move the fused MoE code into one file, `fused_moe.py`.
  2. Integrate the scattered branch conditions into a single function, `get_fused_moe_state`.
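To illustrate the second point, here is a minimal sketch of what consolidating the branch conditions into one decision point can look like. This is hypothetical: the state names, inputs, and selection rules below are assumptions for illustration, not the actual `get_fused_moe_state` implementation in vllm-ascend.

```python
from enum import Enum


class FusedMoEState(Enum):
    """Possible fused-MoE execution states (illustrative names)."""
    AllGather = 0
    All2All = 1
    MC2 = 2


def get_fused_moe_state(ep_size: int, with_prefill: bool) -> FusedMoEState:
    """Collapse scattered branch conditions into a single decision point.

    Assumed rules (for illustration only):
    - no expert parallelism (ep_size == 1) -> AllGather
    - expert parallelism during prefill    -> All2All
    - expert parallelism during decode     -> MC2
    """
    if ep_size == 1:
        return FusedMoEState.AllGather
    if with_prefill:
        return FusedMoEState.All2All
    return FusedMoEState.MC2
```

With one function owning the decision, callers no longer need to re-derive the same conditions (or consult env vars) at every branch site.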

Does this PR introduce any user-facing change?

  1. This PR removes the env `VLLM_ENABLE_MC2`. This env is unnecessary: the same decision can be made from the current runtime scenario, so keeping it only adds complexity.
  2. This PR removes the env `USING_LCCL_COM`, which had already expired.
  3. `additional_config.expert_tensor_parallel_size` has already expired; use the parameter `enable_expert_parallel` instead, consistent with vLLM.
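A before/after sketch of the first change, replacing an env-var gate with a scenario-derived decision. The function names and the exact condition are hypothetical, chosen only to show the shape of the refactor; only the env name `VLLM_ENABLE_MC2` comes from this PR.

```python
import os


def use_mc2_old(ep_size: int) -> bool:
    # Before: behavior gated by an env var the user had to set,
    # on top of the runtime condition.
    return os.environ.get("VLLM_ENABLE_MC2", "0") == "1" and ep_size > 1


def use_mc2_new(ep_size: int, with_prefill: bool) -> bool:
    # After: decide purely from the runtime scenario
    # (illustrative condition), no env var involved.
    return ep_size > 1 and not with_prefill
```

The new form has one less configuration knob to document and test, and it cannot silently disagree with the runtime scenario.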

How was this patch tested?

CI passed


Signed-off-by: zzzzwwjj <1183291235@qq.com>
Signed-off-by: ganyi <pleaplusone.gy@gmail.com>
@ganyi1996ppo (Collaborator, Author) commented:

@zzzzwwjj there is a conflict in `model_runner_v1.py`; please review this code change.

@Yikun Yikun changed the title [cherry-pick][refactor] Refactoring AscendFusedMoE (#1229) [v0.9.1][refactor] Refactoring AscendFusedMoE (#1229) Jun 17, 2025
@Yikun Yikun merged commit 733b0a2 into vllm-project:v0.9.1-dev Jun 17, 2025
16 checks passed