
Commit f68cce8 (1 parent: 09b6a95)
Authored by youkaichao

[ci/build] fix broken tests in LLM.collective_rpc (#15350)

Signed-off-by: youkaichao <youkaichao@gmail.com>

File tree

2 files changed: +3 -12 lines

.buildkite/test-pipeline.yaml (1 addition, 1 deletion)

@@ -515,7 +515,7 @@ steps:
   - vllm/worker/model_runner.py
   - entrypoints/llm/test_collective_rpc.py
   commands:
-  - pytest -v -s entrypoints/llm/test_collective_rpc.py
+  - VLLM_ENABLE_V1_MULTIPROCESSING=0 pytest -v -s entrypoints/llm/test_collective_rpc.py
   - pytest -v -s ./compile/test_basic_correctness.py
   - pytest -v -s ./compile/test_wrapper.py
   - VLLM_TEST_SAME_HOST=1 torchrun --nproc-per-node=4 distributed/test_same_node.py | grep 'Same node test passed'
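The pipeline change above only prefixes the pytest invocation with `VLLM_ENABLE_V1_MULTIPROCESSING=0`. In POSIX shells, a `VAR=value command` prefix exports the variable for that single command only, which is why no separate `export` step is needed in the pipeline. A minimal sketch of the mechanism (using `printenv` rather than the real test, which needs a GPU and the vLLM test suite):

```shell
# The assignment applies only to this one command, not to the rest of
# the shell session (POSIX single-command environment assignment).
VLLM_ENABLE_V1_MULTIPROCESSING=0 printenv VLLM_ENABLE_V1_MULTIPROCESSING
# prints: 0

# Afterwards the variable is not set in the shell itself; printenv
# exits non-zero when the variable is absent.
printenv VLLM_ENABLE_V1_MULTIPROCESSING || echo "not set in session"
```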

tests/entrypoints/llm/test_collective_rpc.py (2 additions, 11 deletions)

@@ -21,18 +21,9 @@ def test_collective_rpc(tp_size, backend):
     def echo_rank(self):
         return self.rank
 
-    from vllm.worker.worker import Worker
-
-    class MyWorker(Worker):
-
-        def echo_rank(self):
-            return self.rank
-
     llm = LLM(model="meta-llama/Llama-3.2-1B-Instruct",
               enforce_eager=True,
               load_format="dummy",
               tensor_parallel_size=tp_size,
-              distributed_executor_backend=backend,
-              worker_cls=MyWorker)
-    for method in ["echo_rank", echo_rank]:
-        assert llm.collective_rpc(method) == list(range(tp_size))
+              distributed_executor_backend=backend)
+    assert llm.collective_rpc(echo_rank) == list(range(tp_size))
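The simplified test now passes the `echo_rank` function directly to `llm.collective_rpc` instead of subclassing `Worker` and dispatching by method name. The sketch below is an illustrative toy (not vLLM's actual implementation) of how a `collective_rpc`-style call can accept either a method name or a plain callable; `ToyWorker` and the standalone `collective_rpc` function are invented here for demonstration:

```python
class ToyWorker:
    """Stands in for a per-rank worker; `rank` plays the role of the TP rank."""

    def __init__(self, rank: int):
        self.rank = rank

    def echo_rank(self):
        return self.rank


def collective_rpc(workers, method):
    """Invoke `method` on every worker and gather results in rank order.

    `method` may be the name of a worker method (str) or a callable that
    receives the worker as its first argument, mirroring the two call
    styles the original test exercised.
    """
    results = []
    for worker in workers:
        if isinstance(method, str):
            results.append(getattr(worker, method)())
        else:
            results.append(method(worker))
    return results


def echo_rank(self):
    # Free function: `self` is the worker it is dispatched against.
    return self.rank


workers = [ToyWorker(rank) for rank in range(4)]
print(collective_rpc(workers, "echo_rank"))  # [0, 1, 2, 3]
print(collective_rpc(workers, echo_rank))    # [0, 1, 2, 3]
```

Both call styles return one result per worker in rank order, which is what the test's `== list(range(tp_size))` assertion checks.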

Comments (0)