
Torch with CUDA Support

By default, installing torch via Poetry pulls a build that was compiled against torch's default CUDA version, and which CUDA version that is depends on the torch release. For example, torch 2.0 may have been built against CUDA 11 by default, whereas later releases may default to CUDA 12.

So to get CUDA support on a machine that does not match the default CUDA version, we have to tell Poetry where to get the correct torch build from instead: specify a source which provides builds that support the desired version of CUDA,

[[tool.poetry.source]]
name = "pytorch-cuda"
url = "https://download.pytorch.org/whl/cu118"
priority = "supplemental"

and then use that source in the dependency specification,

torch = [
    {markers="sys_platform != 'darwin'", version="2.1.1", source="pytorch-cuda"},
    {markers="sys_platform == 'darwin'", version="2.1.1", source="pypi"}
]

where we additionally use environment markers so that the regular (non-CUDA) PyPI build is used on macOS.
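
After running poetry install, a quick sanity check such as the following (using only standard torch attributes) can confirm that the CUDA-enabled build was actually installed:

import torch

# Show the installed torch version, the CUDA toolkit it was built against
# (None indicates a CPU-only build), and whether a GPU is actually usable.
print("torch version:", torch.__version__)
print("built against CUDA:", torch.version.cuda)
print("CUDA available:", torch.cuda.is_available())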
