@oteroantoniogom (Contributor) commented Mar 25, 2025

🔧 Fix: Support triton==3.3.0+git95326d9f for RTX 5090 (Unsloth + vLLM compatibility)

📝 Description

This PR fixes an incompatibility in the Triton kernel logic used in vllm/lora/ops/triton_ops/kernel_utils.py that causes vLLM to break when running with:

  • Unsloth: 2025.3.18
  • Unsloth_zoo: 2025.3.16
  • Triton: 3.3.0+git95326d9f
  • Torch: 2.8.0.dev20250324+cu128
  • Torchvision: 0.22.0.dev20250324+cu128
  • Torchaudio: 2.6.0.dev20250324+cu128
  • xformers: 0.0.30+4fa0149.d20250325

All tested using Unsloth and Qwen2.5-3B-Instruct, with an NVIDIA RTX 5090.


🐛 Problem

When running vLLM with Unsloth, the following error occurred:

AttributeError: 'tuple_type' object has no attribute 'is_ptr'

This was traced to incorrect pointer construction using:

a_ptr = (some_expr, )

Triton now treats this as a tuple_type, which breaks downstream logic that expects a pointer object.


✅ Fix

We removed unnecessary trailing commas in pointer assignments so that actual pointer_type objects are passed instead of tuples. For example:

- a_ptr = (cur_input_ptr + ram[:, None] * input_d1_stride + offset_k[None, :] * input_d2_stride, )
+ a_ptr = cur_input_ptr + ram[:, None] * input_d1_stride + offset_k[None, :] * input_d2_stride
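The failure mode can be illustrated in plain Python (stand-in integers instead of real Triton pointer values; the names below are illustrative, not taken from the kernel):

```python
# Plain-Python analogue of the bug: a trailing comma turns an expression
# into a 1-element tuple. In the Triton kernel this made a_ptr a
# tuple_type instead of a pointer, breaking downstream `.is_ptr` checks.
cur_input_ptr = 0x1000   # stand-in for the real device pointer
offset = 24              # stand-in for the computed stride offset

a_ptr_buggy = (cur_input_ptr + offset, )  # tuple, not a pointer
a_ptr_fixed = cur_input_ptr + offset      # plain value, as intended

print(type(a_ptr_buggy).__name__)  # tuple
print(type(a_ptr_fixed).__name__)  # int
```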

🚀 Result

This fix makes vLLM fully compatible with Triton 3.3.0+, allowing Unsloth-based LoRA fine-tuning and inference to work out-of-the-box with newer GPUs like the RTX 5090.


🔒 Notes

  • This PR assumes the environment variable VLLM_INSTALL_PUNICA_KERNELS=1 is set during installation.
  • Installed using pip install -e . --no-build-isolation.
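Putting the two notes together, the install sequence might look like the sketch below (run from a vLLM checkout; whether current builds still honor VLLM_INSTALL_PUNICA_KERNELS is an assumption carried over from these notes, and the build step itself is shown commented out):

```shell
# Sketch of the install sequence described in the notes above.
export VLLM_INSTALL_PUNICA_KERNELS=1
echo "VLLM_INSTALL_PUNICA_KERNELS=$VLLM_INSTALL_PUNICA_KERNELS"
# pip install -e . --no-build-isolation   # the editable build itself
```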

@github-actions

👋 Hi! Thank you for contributing to the vLLM project.

💬 Join our developer Slack at https://slack.vllm.ai to discuss your PR in #pr-reviews, coordinate on features in #feat- channels, or join special interest groups in #sig- channels.

Just a reminder: PRs do not trigger a full CI run by default. Instead, only the fastcheck CI runs, covering a small, essential subset of tests to catch errors quickly. You can run additional CI tests on top of those by going to your fastcheck build in the Buildkite UI (linked in the PR checks section) and unblocking them. If you do not have permission to unblock, ping simon-mo or khluu to add you to our Buildkite org.

Once the PR is approved and ready to go, your PR reviewer(s) can run CI to test the changes comprehensively before merging.

To run CI, PR reviewers can either: Add ready label to the PR or enable auto-merge.

🚀

@jeejeelee (Collaborator)

I have unblocked all the LoRA-related tests to verify that the LoRA CI passes correctly.

@mgoin (Member) left a comment
Seems reasonable to me if existing tests are green since b_ptr is without the comma

@jeejeelee (Collaborator)

> Seems reasonable to me if existing tests are green since b_ptr is without the comma

Yeah, I think this change doesn't affect the current LoRA ops.

@jeejeelee jeejeelee changed the title Fix: Support triton==3.3.0+git95326d9f for RTX 5090 (Unsloth + vLLM compatibility) [Bugfix] Support triton==3.3.0+git95326d9f for RTX 5090 (Unsloth + vLLM compatibility) Mar 25, 2025
@jeejeelee jeejeelee added the ready ONLY add when PR is ready to merge/full CI is needed label Mar 25, 2025
@jeejeelee jeejeelee enabled auto-merge (squash) March 25, 2025 15:50
@jeejeelee jeejeelee merged commit 5d8e1c9 into vllm-project:main Mar 25, 2025
53 checks passed
wrmedford pushed a commit to wrmedford/vllm that referenced this pull request Mar 26, 2025
…LM compatibility) (vllm-project#15471)

Co-authored-by: ServerAI <ai@exc-mad-ai.com>
Signed-off-by: Wes Medford <wryanmedford@gmail.com>
@woreom commented Apr 1, 2025

I am still unable to install unsloth and vllm together: installing vllm with pip clobbers my environment, uninstalling triton 3.1.0, numpy 2.2.4, torch 2.5.1, xformers 0.0.28.post3, torchvision 0.20.1, and torchaudio 2.5.1. Building from source also raises an error.

pip install git+https://github.com/vllm-project/vllm.git --no-build-isolation

Building wheels for collected packages: vllm
  Building editable for vllm (pyproject.toml) ... error
  error: subprocess-exited-with-error
  
  × Building editable for vllm (pyproject.toml) did not run successfully.
  │ exit code: 1
  ╰─> [153 lines of output]
      running editable_wheel
      creating /tmp/pip-wheel-nf6mgmhh/.tmp-pcopxehe/vllm.egg-info
      writing /tmp/pip-wheel-nf6mgmhh/.tmp-pcopxehe/vllm.egg-info/PKG-INFO
      writing dependency_links to /tmp/pip-wheel-nf6mgmhh/.tmp-pcopxehe/vllm.egg-info/dependency_links.txt
      writing entry points to /tmp/pip-wheel-nf6mgmhh/.tmp-pcopxehe/vllm.egg-info/entry_points.txt
      writing requirements to /tmp/pip-wheel-nf6mgmhh/.tmp-pcopxehe/vllm.egg-info/requires.txt
      writing top-level names to /tmp/pip-wheel-nf6mgmhh/.tmp-pcopxehe/vllm.egg-info/top_level.txt
      writing manifest file '/tmp/pip-wheel-nf6mgmhh/.tmp-pcopxehe/vllm.egg-info/SOURCES.txt'
      reading manifest template 'MANIFEST.in'
      adding license file 'LICENSE'
      writing manifest file '/tmp/pip-wheel-nf6mgmhh/.tmp-pcopxehe/vllm.egg-info/SOURCES.txt'
      creating '/tmp/pip-wheel-nf6mgmhh/.tmp-pcopxehe/vllm-0.8.3.dev151+ge6e3c55e.cu118.dist-info'
      creating /tmp/pip-wheel-nf6mgmhh/.tmp-pcopxehe/vllm-0.8.3.dev151+ge6e3c55e.cu118.dist-info/WHEEL
      running build_py
      running build_ext
      -- The CXX compiler identification is GNU 11.4.0
      -- Detecting CXX compiler ABI info
      -- Detecting CXX compiler ABI info - done
      -- Check for working CXX compiler: /usr/bin/c++ - skipped
      -- Detecting CXX compile features
      -- Detecting CXX compile features - done
      -- Build type: RelWithDebInfo
      -- Target device: cuda
      -- Found Python: ~/miniconda3/envs/unsloth/bin/python3.11 (found version "3.11.11") found components: Interpreter Development.Module Development.SABIModule
      -- Found python matching: ~/miniconda3/envs/unsloth/bin/python3.11.
      -- Found CUDA: /usr/local/cuda (found version "11.8")
      -- The CUDA compiler identification is unknown
      CMake Error at ~/miniconda3/envs/unsloth/share/cmake-4.0/Modules/CMakeDetermineCUDACompiler.cmake:266 (message):
        Failed to detect a default CUDA architecture.
      
      
      
        Compiler output:
      
      Call Stack (most recent call first):
        ~/miniconda3/envs/unsloth/lib/python3.11/site-packages/torch/share/cmake/Caffe2/public/cuda.cmake:47 (enable_language)
        ~/miniconda3/envs/unsloth/lib/python3.11/site-packages/torch/share/cmake/Caffe2/Caffe2Config.cmake:86 (include)
        ~/miniconda3/envs/unsloth/lib/python3.11/site-packages/torch/share/cmake/Torch/TorchConfig.cmake:68 (find_package)
        CMakeLists.txt:81 (find_package)
      
      
      -- Configuring incomplete, errors occurred!
      Traceback (most recent call last):
        File "~/miniconda3/envs/unsloth/lib/python3.11/site-packages/setuptools/command/editable_wheel.py", line 139, in run
          self._create_wheel_file(bdist_wheel)
        File "~/miniconda3/envs/unsloth/lib/python3.11/site-packages/setuptools/command/editable_wheel.py", line 340, in _create_wheel_file
          files, mapping = self._run_build_commands(dist_name, unpacked, lib, tmp)
                           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "~/miniconda3/envs/unsloth/lib/python3.11/site-packages/setuptools/command/editable_wheel.py", line 263, in _run_build_commands
          self._run_build_subcommands()
        File "~/miniconda3/envs/unsloth/lib/python3.11/site-packages/setuptools/command/editable_wheel.py", line 290, in _run_build_subcommands
          self.run_command(name)
        File "~/miniconda3/envs/unsloth/lib/python3.11/site-packages/setuptools/_distutils/cmd.py", line 339, in run_command
          self.distribution.run_command(command)
        File "~/miniconda3/envs/unsloth/lib/python3.11/site-packages/setuptools/dist.py", line 999, in run_command
          super().run_command(command)
        File "~/miniconda3/envs/unsloth/lib/python3.11/site-packages/setuptools/_distutils/dist.py", line 1002, in run_command
          cmd_obj.run()
        File "<string>", line 267, in run
        File "~/miniconda3/envs/unsloth/lib/python3.11/site-packages/setuptools/command/build_ext.py", line 99, in run
          _build_ext.run(self)
        File "~/miniconda3/envs/unsloth/lib/python3.11/site-packages/setuptools/_distutils/command/build_ext.py", line 365, in run
          self.build_extensions()
        File "<string>", line 226, in build_extensions
        File "<string>", line 204, in configure
        File "~/miniconda3/envs/unsloth/lib/python3.11/subprocess.py", line 413, in check_call
          raise CalledProcessError(retcode, cmd)
      subprocess.CalledProcessError: Command '['cmake', '/mnt/d/Rowan/found-in-com/vlms-in-wireless-communication/vllm', '-G', 'Ninja', '-DCMAKE_BUILD_TYPE=RelWithDebInfo', '-DVLLM_TARGET_DEVICE=cuda', '-DVLLM_PYTHON_EXECUTABLE=~/miniconda3/envs/unsloth/bin/python3.11', '-DVLLM_PYTHON_PATH=~/miniconda3/envs/unsloth/lib/python311.zip:~/miniconda3/envs/unsloth/lib/python3.11:~/miniconda3/envs/unsloth/lib/python3.11/lib-dynload:~/.local/lib/python3.11/site-packages:~/miniconda3/envs/unsloth/lib/python3.11/site-packages:~/miniconda3/envs/unsloth/lib/python3.11/site-packages/setuptools/_vendor', '-DFETCHCONTENT_BASE_DIR=/mnt/d/Rowan/found-in-com/vlms-in-wireless-communication/vllm/.deps', '-DNVCC_THREADS=1', '-DCMAKE_JOB_POOL_COMPILE:STRING=compile', '-DCMAKE_JOB_POOLS:STRING=compile=32']' returned non-zero exit status 1.
      ~/miniconda3/envs/unsloth/lib/python3.11/site-packages/setuptools/_distutils/dist.py:1002: _DebuggingTips: Problem in editable installation.
      !!
      
              ********************************************************************************
              An error happened while installing `vllm` in editable mode.
      
              The following steps are recommended to help debug this problem:
      
              - Try to install the project normally, without using the editable mode.
                Does the error still persist?
                (If it does, try fixing the problem before attempting the editable mode).
              - If you are using binary extensions, make sure you have all OS-level
                dependencies installed (e.g. compilers, toolchains, binary libraries, ...).
              - Try the latest version of setuptools (maybe the error was already fixed).
              - If you (or your project dependencies) are using any setuptools extension
                or customization, make sure they support the editable mode.
      
              After following the steps above, if the problem still persists and
              you think this is related to how setuptools handles editable installations,
              please submit a reproducible example
              (see https://stackoverflow.com/help/minimal-reproducible-example) to:
      
                  https://github.com/pypa/setuptools/issues
      
              See https://setuptools.pypa.io/en/latest/userguide/development_mode.html for details.
              ********************************************************************************
      
      !!
        cmd_obj.run()
      Traceback (most recent call last):
        File "~/miniconda3/envs/unsloth/lib/python3.11/site-packages/pip/_vendor/pyproject_hooks/_in_process/_in_process.py", line 389, in <module>
          main()
        File "~/miniconda3/envs/unsloth/lib/python3.11/site-packages/pip/_vendor/pyproject_hooks/_in_process/_in_process.py", line 373, in main
          json_out["return_val"] = hook(**hook_input["kwargs"])
                                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "~/miniconda3/envs/unsloth/lib/python3.11/site-packages/pip/_vendor/pyproject_hooks/_in_process/_in_process.py", line 303, in build_editable
          return hook(wheel_directory, config_settings, metadata_directory)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "~/miniconda3/envs/unsloth/lib/python3.11/site-packages/setuptools/build_meta.py", line 476, in build_editable
          return self._build_with_temp_dir(
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "~/miniconda3/envs/unsloth/lib/python3.11/site-packages/setuptools/build_meta.py", line 407, in _build_with_temp_dir
          self.run_setup()
        File "~/miniconda3/envs/unsloth/lib/python3.11/site-packages/setuptools/build_meta.py", line 320, in run_setup
          exec(code, locals())
        File "<string>", line 675, in <module>
        File "~/miniconda3/envs/unsloth/lib/python3.11/site-packages/setuptools/__init__.py", line 117, in setup
          return distutils.core.setup(**attrs)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "~/miniconda3/envs/unsloth/lib/python3.11/site-packages/setuptools/_distutils/core.py", line 186, in setup
          return run_commands(dist)
                 ^^^^^^^^^^^^^^^^^^
        File "~/miniconda3/envs/unsloth/lib/python3.11/site-packages/setuptools/_distutils/core.py", line 202, in run_commands
          dist.run_commands()
        File "~/miniconda3/envs/unsloth/lib/python3.11/site-packages/setuptools/_distutils/dist.py", line 983, in run_commands
          self.run_command(cmd)
        File "~/miniconda3/envs/unsloth/lib/python3.11/site-packages/setuptools/dist.py", line 999, in run_command
          super().run_command(command)
        File "~/miniconda3/envs/unsloth/lib/python3.11/site-packages/setuptools/_distutils/dist.py", line 1002, in run_command
          cmd_obj.run()
        File "~/miniconda3/envs/unsloth/lib/python3.11/site-packages/setuptools/command/editable_wheel.py", line 139, in run
          self._create_wheel_file(bdist_wheel)
        File "~/miniconda3/envs/unsloth/lib/python3.11/site-packages/setuptools/command/editable_wheel.py", line 340, in _create_wheel_file
          files, mapping = self._run_build_commands(dist_name, unpacked, lib, tmp)
                           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "~/miniconda3/envs/unsloth/lib/python3.11/site-packages/setuptools/command/editable_wheel.py", line 263, in _run_build_commands
          self._run_build_subcommands()
        File "~/miniconda3/envs/unsloth/lib/python3.11/site-packages/setuptools/command/editable_wheel.py", line 290, in _run_build_subcommands
          self.run_command(name)
        File "~/miniconda3/envs/unsloth/lib/python3.11/site-packages/setuptools/_distutils/cmd.py", line 339, in run_command
          self.distribution.run_command(command)
        File "~/miniconda3/envs/unsloth/lib/python3.11/site-packages/setuptools/dist.py", line 999, in run_command
          super().run_command(command)
        File "~/miniconda3/envs/unsloth/lib/python3.11/site-packages/setuptools/_distutils/dist.py", line 1002, in run_command
          cmd_obj.run()
        File "<string>", line 267, in run
        File "~/miniconda3/envs/unsloth/lib/python3.11/site-packages/setuptools/command/build_ext.py", line 99, in run
          _build_ext.run(self)
        File "~/miniconda3/envs/unsloth/lib/python3.11/site-packages/setuptools/_distutils/command/build_ext.py", line 365, in run
          self.build_extensions()
        File "<string>", line 226, in build_extensions
        File "<string>", line 204, in configure
        File "~/miniconda3/envs/unsloth/lib/python3.11/subprocess.py", line 413, in check_call
          raise CalledProcessError(retcode, cmd)
      subprocess.CalledProcessError: Command '['cmake', '/mnt/d/Rowan/found-in-com/vlms-in-wireless-communication/vllm', '-G', 'Ninja', '-DCMAKE_BUILD_TYPE=RelWithDebInfo', '-DVLLM_TARGET_DEVICE=cuda', '-DVLLM_PYTHON_EXECUTABLE=~/miniconda3/envs/unsloth/bin/python3.11', '-DVLLM_PYTHON_PATH=~/miniconda3/envs/unsloth/lib/python311.zip:~/miniconda3/envs/unsloth/lib/python3.11:~/miniconda3/envs/unsloth/lib/python3.11/lib-dynload:~/.local/lib/python3.11/site-packages:~/miniconda3/envs/unsloth/lib/python3.11/site-packages:~/miniconda3/envs/unsloth/lib/python3.11/site-packages/setuptools/_vendor', '-DFETCHCONTENT_BASE_DIR=/mnt/d/Rowan/found-in-com/vlms-in-wireless-communication/vllm/.deps', '-DNVCC_THREADS=1', '-DCMAKE_JOB_POOL_COMPILE:STRING=compile', '-DCMAKE_JOB_POOLS:STRING=compile=32']' returned non-zero exit status 1.
      [end of output]
  
  note: This error originates from a subprocess, and is likely not a problem with pip.
  ERROR: Failed building editable for vllm
Failed to build vllm
ERROR: Failed to build installable wheels for some pyproject.toml based projects (vllm)

Minimal script to reproduce the error:

#!/bin/zsh

# Check if unsloth environment exists and delete it if found
if conda info --envs | grep -q "^unsloth "; then
    echo "Found existing 'unsloth' environment. Removing it..."
    conda env remove -n unsloth -y
else
    echo "No existing 'unsloth' environment found."
fi

# Create new conda environment
echo "Creating new 'unsloth' environment..."

conda create --name unsloth \
        python=3.11 \
        pytorch-cuda=12.1 \
        pytorch==2.5.1 torchvision==0.20.1 torchaudio==2.5.1 cudatoolkit xformers -c pytorch -c nvidia -c xformers \
        -y

conda init 
conda activate unsloth

conda install -n unsloth -c conda-forge ipywidgets -y


# Install packages using conda run (safer in scripts than direct activation)
echo "Installing required packages..."
conda run -n unsloth pip install setuptools setuptools_scm wheel build  
conda run -n unsloth conda install cmake -y
conda run -n unsloth pip install unsloth
conda run -n unsloth pip install ipykernel numpy tqdm scipy==1.13.1 pandas scikit-learn trl peft
# conda run -n unsloth pip install git+https://github.com/vllm-project/vllm.git --no-build-isolation


echo "Setup complete! Use 'conda activate unsloth' to activate the environment."

@XvKuoMing

> I am still unable to install unsloth and vllm because using pip to install vllm is untrustworthy; [...] (full comment, build log, and reproduction script quoted verbatim from @woreom above)

I have the same issue.

lulmer pushed a commit to lulmer/vllm that referenced this pull request Apr 7, 2025
…LM compatibility) (vllm-project#15471)

Co-authored-by: ServerAI <ai@exc-mad-ai.com>
Signed-off-by: Louis Ulmer <ulmerlouis@gmail.com>
lk-chen pushed a commit to lk-chen/vllm that referenced this pull request Apr 29, 2025
…LM compatibility) (vllm-project#15471)

Co-authored-by: ServerAI <ai@exc-mad-ai.com>
shreyankg pushed a commit to shreyankg/vllm that referenced this pull request May 3, 2025
…LM compatibility) (vllm-project#15471)

Co-authored-by: ServerAI <ai@exc-mad-ai.com>
RichardoMrMu pushed a commit to RichardoMrMu/vllm that referenced this pull request May 12, 2025
…LM compatibility) (vllm-project#15471)

Co-authored-by: ServerAI <ai@exc-mad-ai.com>
Signed-off-by: Mu Huai <tianbowen.tbw@antgroup.com>
5 participants