
Commit 6d33d22

jeejeelee authored and eicherseiji committed
[Bugfix] Fix test_mixtral_moe (vllm-project#24371)
1 parent 4420872 commit 6d33d22

File tree

1 file changed: +2 -2 lines changed


tests/kernels/moe/test_moe.py

Lines changed: 2 additions & 2 deletions
@@ -371,8 +371,8 @@ def test_fused_moe_wn16(m: int, n: int, k: int, e: int, topk: int,
 @pytest.mark.parametrize(
     "use_rocm_aiter", [True, False] if current_platform.is_rocm() else [False])
 @torch.inference_mode()
-def test_mixtral_moe(dtype: torch.dtype, padding: bool, use_rocm_aiter: bool,
-                     monkeypatch):
+def test_mixtral_moe(dist_init, dtype: torch.dtype, padding: bool,
+                     use_rocm_aiter: bool, monkeypatch):
     """Make sure our Mixtral MoE implementation agrees with the one from
     huggingface."""
 
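
Context for the change: pytest injects fixtures by parameter name, so adding dist_init to the test signature makes pytest run that fixture before the test body executes. The real dist_init fixture lives in vLLM's test conftest and its body is not shown in this diff; the sketch below uses a hypothetical placeholder body only to illustrate the injection pattern.

import pytest

@pytest.fixture
def dist_init():
    # Hypothetical stand-in for vLLM's real dist_init fixture: set up
    # whatever (distributed) state the test needs, yield, then tear down.
    state = {"initialized": True}   # placeholder setup
    yield state
    state["initialized"] = False    # placeholder teardown

def test_mixtral_moe(dist_init):
    # pytest matches the parameter name to the fixture above, runs it,
    # and passes the yielded value in before the test body executes.
    assert dist_init["initialized"]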
