
Llama.cpp + Intel Arc A770, RuntimeError: PyTorch is not linked with support for opencl devices #12086

Open
compellingbytes opened this issue Sep 16, 2024 · 1 comment

@compellingbytes
Hi,
I'm trying to get llama.cpp up and running on Ubuntu 24.04 (kernel 6.8.0-44-generic) with ipex-llm, and it seems I can't select OpenCL as the API to run the model on. I'm not sure which option to choose, since I thought the A770 was an OpenCL device.

Here's the readout:

Loading checkpoint shards: 100%|█████████████████████████████████████████████████████████████████████████| 2/2 [00:00<00:00, 15.93it/s]
2024-09-15 00:56:22,587 - INFO - Converting the current model to sym_int4 format......
Traceback (most recent call last):
  File "/home/cbytes/demo.py", line 11, in <module>
    model = model.to('opencl')
            ^^^^^^^^^^^^^^^^^^
  File "/home/cbytes/miniforge3/envs/llm/lib/python3.11/site-packages/transformers/modeling_utils.py", line 2905, in to
    return super().to(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/cbytes/miniforge3/envs/llm/lib/python3.11/site-packages/torch/nn/modules/module.py", line 1174, in to
    return self._apply(convert)
           ^^^^^^^^^^^^^^^^^^^^
  File "/home/cbytes/miniforge3/envs/llm/lib/python3.11/site-packages/torch/nn/modules/module.py", line 780, in _apply
    module._apply(fn)
  File "/home/cbytes/miniforge3/envs/llm/lib/python3.11/site-packages/torch/nn/modules/module.py", line 780, in _apply
    module._apply(fn)
  File "/home/cbytes/miniforge3/envs/llm/lib/python3.11/site-packages/torch/nn/modules/module.py", line 805, in _apply
    param_applied = fn(param)
                    ^^^^^^^^^
  File "/home/cbytes/miniforge3/envs/llm/lib/python3.11/site-packages/torch/nn/modules/module.py", line 1160, in convert
    return t.to(
           ^^^^^
RuntimeError: PyTorch is not linked with support for opencl devices
@qiuxin2012
Contributor

It's an xpu device; you can use model.to('xpu').
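
For reference, a minimal sketch of how that fix fits into a typical ipex-llm script. The model path, prompt, and generation call here are illustrative, not taken from the original demo.py; this assumes ipex-llm is installed and exposes ipex_llm.transformers.AutoModelForCausalLM:

import torch
from ipex_llm.transformers import AutoModelForCausalLM
from transformers import AutoTokenizer

model_path = "meta-llama/Llama-2-7b-chat-hf"  # hypothetical example; use your own model path

# ipex-llm loads the weights on the CPU and converts them (e.g. to sym_int4) first.
model = AutoModelForCausalLM.from_pretrained(model_path,
                                             load_in_4bit=True,
                                             trust_remote_code=True)
tokenizer = AutoTokenizer.from_pretrained(model_path, trust_remote_code=True)

# The Arc A770 is exposed to PyTorch as an 'xpu' device, not 'opencl'.
model = model.to('xpu')

with torch.inference_mode():
    inputs = tokenizer("What is an Intel Arc A770?", return_tensors="pt").to('xpu')
    output = model.generate(inputs.input_ids, max_new_tokens=32)
    print(tokenizer.decode(output[0], skip_special_tokens=True))

In other words, only the device string in the original script needs to change: model.to('opencl') becomes model.to('xpu').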
