
Commit 4e996e7 (1 parent: 236c52d)

fix

Signed-off-by: Bill Nell <bnell@redhat.com>

File tree (1 file changed: +1 −1 lines)
  • vllm/model_executor/layers/fused_moe


vllm/model_executor/layers/fused_moe/layer.py (1 addition, 1 deletion)

@@ -381,7 +381,7 @@ def apply(
         zero_expert_type = getattr(layer, "zero_expert_type", None)

         if enable_eplb:
-            if not self.supports_eplb:
+            if self.supports_eplb:
                 assert expert_load_view is not None
                 assert logical_to_physical_map is not None
                 assert logical_replica_count is not None
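The change flips an inverted guard: before the fix, the EPLB (expert parallelism load balancing) arguments were asserted non-None precisely when the method did *not* support EPLB, which is backwards. A minimal sketch of the corrected behavior is below; the class names, the `supports_eplb` flag placement, and the `apply` signature here are simplified stand-ins, not vLLM's actual interface.

```python
class FusedMoEMethodSketch:
    """Hypothetical stand-in for a fused-MoE quantization method."""

    # Methods advertise EPLB support via a flag like this one.
    supports_eplb: bool = False

    def apply(self, enable_eplb=False, expert_load_view=None,
              logical_to_physical_map=None, logical_replica_count=None):
        if enable_eplb:
            # Fixed guard: validate the EPLB bookkeeping inputs only
            # when this method can actually use them. The pre-fix code
            # used `not self.supports_eplb`, asserting the inputs for
            # exactly the methods that ignore them.
            if self.supports_eplb:
                assert expert_load_view is not None
                assert logical_to_physical_map is not None
                assert logical_replica_count is not None
        return "applied"


class EplbCapableMethodSketch(FusedMoEMethodSketch):
    """Hypothetical method that does support EPLB."""
    supports_eplb = True
```

With the corrected condition, a method that lacks EPLB support no longer trips the assertions when `enable_eplb=True`, while an EPLB-capable method still requires all three bookkeeping arguments.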
