
ImportError: libcudart.so.12 #1718

Closed
tranhoangnguyen03 opened this issue Nov 19, 2023 · 10 comments
Labels
installation Installation problems

Comments

@tranhoangnguyen03

It was working yesterday; today it gives the following error:

from vllm import cuda_utils
ImportError: libcudart.so.12: cannot open shared object file: No such file or directory

I've tried turning it off and on again multiple times.
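The error means the CUDA 12 runtime library (libcudart.so.12) that the new wheels link against is not visible to the dynamic loader. As a quick diagnostic (a sketch added here for context, not part of the original report), you can check what is actually installed:

ldconfig -p | grep libcudart    # lists the libcudart versions the loader can find
python -c "import torch; print(torch.__version__, torch.version.cuda)"    # shows which CUDA build of PyTorch is installed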

@WoosukKwon
Collaborator

WoosukKwon commented Nov 19, 2023

Hi @tranhoangnguyen03, we just released a new version (v0.2.2) which depends on PyTorch 2.1 and CUDA 12.1. This might cause errors if your environment does not have CUDA 12.1.

  1. Please upgrade PyTorch using pip install --upgrade torch and see if the error persists.
  2. If that does not work, please install vLLM + CUDA 11.8 following the instructions here (see the command sketch below).
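Written out as shell commands, the two options look roughly like this (a hedged sketch; the cu118 index URL is the standard PyTorch wheel index, and the matching vLLM cu118 wheel itself comes from the instructions linked above):

# Option 1: upgrade PyTorch so its wheel bundles the CUDA 12.1 runtime (libcudart.so.12)
pip install --upgrade torch

# Option 2: stay on CUDA 11.8 by switching to the cu118 PyTorch build,
# then install the vLLM cu118 wheel from the linked instructions
pip install torch --upgrade --index-url https://download.pytorch.org/whl/cu118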

@tranhoangnguyen03
Author

In case anyone else runs into the same trouble as I did, here's how to make it work on Google Colab:

!pip -q install --upgrade fschat accelerate autoawq vllm
!pip install torch==2.1.0+cu121 torchvision==0.16.0+cu121 torchaudio==2.1.0 torchtext==0.16.0+cpu torchdata==0.7.0 --index-url https://download.pytorch.org/whl/cu121
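After these installs, a quick sanity check (a sketch, assuming the installed vLLM exposes vllm.__version__) is to confirm the CUDA build of PyTorch and that vLLM imports cleanly:

!python -c "import torch; print(torch.__version__, torch.version.cuda)"    # expect 2.1.0+cu121 and 12.1
!python -c "import vllm; print(vllm.__version__)"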

@Tanio253

Thank you! Super useful.

@Hukongtao

Is there a cuda11.7 version?

@PeterXiaTian

Is there a cuda11.1 version?

@datalee

datalee commented Feb 21, 2024

Is there a cuda11.7 version?

@AWangji

AWangji commented May 22, 2024

Same question. Has anyone solved it?

@DarkLight1337 DarkLight1337 added the installation Installation problems label May 31, 2024
@DarkLight1337
Member

For other CUDA versions, please build vLLM from source.
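For reference, the build-from-source flow looks roughly like this (a sketch; exact steps and environment variables depend on your CUDA toolkit and the version you check out, so follow the installation docs for details):

git clone https://github.com/vllm-project/vllm.git
cd vllm
pip install -e .    # compiles the CUDA kernels against the CUDA toolkit found on your system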

@qinhuangdaoStation

Excuse me, which version of vLLM does that Colab setup install?

@DarkLight1337
Member

That is a very old version. I recommend installing the latest vLLM in a clean environment.
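A minimal clean-environment install would look something like this (a sketch, assuming a recent Python and an NVIDIA driver new enough for the CUDA runtime that current wheels bundle):

python -m venv vllm-env
source vllm-env/bin/activate
pip install -U vllm    # pip pulls in a matching PyTorch build with its bundled CUDA runtime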
