Closed
Labels: installation (Installation problems)
Description
Your current environment
2 x RTX 6000 PRO
python -m vllm.entrypoints.api_server --model cognitivecomputations/Qwen3-235B-A22B-AWQ --enable-reasoning --reasoning-parser deepseek_r1 -tp 2
ImportError: /home/giga/vllm/vllm/_C.abi3.so: undefined symbol: _Z35cutlass_blockwise_scaled_grouped_mmRN2at6TensorERKS0_S3_S3_S3_S3_S3_
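For reference, the missing symbol in the ImportError can be demangled with `c++filt` (from binutils) to see which C++ function the compiled extension expects but cannot find:

```shell
# Demangle the undefined symbol from the ImportError above
c++filt _Z35cutlass_blockwise_scaled_grouped_mmRN2at6TensorERKS0_S3_S3_S3_S3_S3_
# -> cutlass_blockwise_scaled_grouped_mm(at::Tensor&, at::Tensor const&, ...)
#    (output truncated here; the remaining parameters are further at::Tensor const&)
```

This shows the loader is looking for the `cutlass_blockwise_scaled_grouped_mm` kernel entry point, which suggests `_C.abi3.so` was built without that CUTLASS kernel or is stale relative to the Python sources.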
How you are installing vllm
Verified via nvidia-smi: Driver 575.62, CUDA 12.9
git clone https://github.com/vllm-project/vllm.git
cd vllm
python -m venv vllm
source ./vllm/bin/activate
pip3 install --pre torch torchvision torchaudio --index-url https://download.pytorch.org/whl/nightly/cu129 # note changed cu128 to cu129
python use_existing_torch.py
python -m pip install -r requirements/build.txt
python -m pip install -r requirements/common.txt
python -m pip install -e . --no-build-isolation -v
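In case it helps triage: an undefined symbol in `_C.abi3.so` is often caused by stale build artifacts left over from a previous compile (for example after switching torch or CUDA versions mid-checkout). A minimal clean-rebuild sketch, assuming the clone and venv layout from the steps above (the paths and flags are taken from those steps, not from vLLM documentation):

```shell
# Hedged sketch: force a clean rebuild of vllm's C extensions
# (assumes the repo checkout and activated venv from the steps above)
cd vllm
rm -rf build/ dist/ *.egg-info          # drop stale CMake/setuptools artifacts
find vllm -name '*.so' -delete          # remove previously built extension binaries
pip cache purge                         # avoid reinstalling a cached wheel
python -m pip install -e . --no-build-isolation -v
```

Rebuilding from a clean tree ensures the `.so` the editable install loads matches the kernels compiled for the current torch/CUDA combination.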
Before submitting a new issue...
- Make sure you already searched for relevant issues, and asked the chatbot living at the bottom right corner of the documentation page, which can answer lots of frequently asked questions.