Conversation

@wangxiyuan (Collaborator)

vllm-ascend currently only supports MoE for DeepSeek. We should add common MoE support back.
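
For context, here is a minimal sketch of the kind of "common" fused-MoE computation this PR restores for non-DeepSeek models: softmax top-k routing followed by a per-expert gated MLP. This is an illustrative plain-PyTorch reference, not vllm-ascend's actual kernel; the function name `common_fused_moe` and all parameter names are assumptions for illustration.

```python
# Illustrative reference for a generic fused-MoE forward pass
# (softmax top-k routing + per-expert gated MLP). Not vllm-ascend's
# actual implementation; names and shapes are assumptions.
import torch
import torch.nn.functional as F


def common_fused_moe(
    hidden_states: torch.Tensor,   # [num_tokens, hidden_size]
    router_logits: torch.Tensor,   # [num_tokens, num_experts]
    w13: torch.Tensor,             # [num_experts, 2 * intermediate, hidden_size]
    w2: torch.Tensor,              # [num_experts, hidden_size, intermediate]
    top_k: int,
) -> torch.Tensor:
    # Route each token to its top-k experts and renormalize the weights.
    weights = F.softmax(router_logits, dim=-1, dtype=torch.float32)
    topk_weights, topk_ids = torch.topk(weights, top_k, dim=-1)
    topk_weights = topk_weights / topk_weights.sum(dim=-1, keepdim=True)

    out = torch.zeros_like(hidden_states)
    for expert_id in range(w13.shape[0]):
        # Find which tokens (and which of their top-k slots) hit this expert.
        token_idx, slot = torch.where(topk_ids == expert_id)
        if token_idx.numel() == 0:
            continue
        x = hidden_states[token_idx]
        # Gated MLP: SiLU(x @ W1^T) * (x @ W3^T), projected back with W2.
        gate, up = (x @ w13[expert_id].t()).chunk(2, dim=-1)
        y = (F.silu(gate) * up) @ w2[expert_id].t()
        # Accumulate each expert's output, scaled by its routing weight.
        out.index_add_(
            0, token_idx,
            y * topk_weights[token_idx, slot].unsqueeze(-1).to(y.dtype))
    return out
```

Real implementations fuse the routing, gather, and expert GEMMs into dedicated kernels (e.g. NPU ops on Ascend); the per-expert loop above only shows the semantics.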

@Yikun mentioned this pull request on Apr 28, 2025
@wangxiyuan force-pushed the fix_moe branch 3 times, most recently from 898fe69 to 34b5aae, on April 28, 2025 13:18
Signed-off-by: wangxiyuan <wangxiyuan1007@gmail.com>
@wangxiyuan changed the title from "[BugFix] Fix moe error" to "[Model] Support common fused moe ops for moe model, such as Qwen3Moe" on Apr 28, 2025
@Yikun (Collaborator) commented on Apr 28, 2025

Considering the 0.8.4 CI already works, let's merge this quickly.

@wangxiyuan merged commit 1fce70a into vllm-project:main on Apr 28, 2025
13 checks passed
@wangxiyuan changed the title from "[Model] Support common fused moe ops for moe model, such as Qwen3Moe" to "[Model] Support common fused moe ops for moe model" on Apr 28, 2025
@wangxiyuan deleted the fix_moe branch on April 30, 2025 01:39
chopper0126 pushed a commit to chopper0126/vllm-ascend that referenced this pull request on Oct 16, 2025:
…llm-project#709

Angazenn pushed a commit to Angazenn/vllm-ascend that referenced this pull request on Oct 21, 2025:
…llm-project#709
