GPU instead of CPU? #214

Closed
Whhhatttt opened this issue Mar 16, 2023 · 5 comments

Comments

@Whhhatttt

How can we use the GPU instead of the CPU? My processor is pretty weak; I don't have a MacBook or a very powerful PC, so I would like to run the model on CUDA cores. Thanks.

@j-f1
Collaborator

j-f1 commented Mar 16, 2023

You might have better luck with the original LLaMA repo, which I believe supports CUDA. This repo is focused on running on the CPU.
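
For context, the original LLaMA reference implementation runs on PyTorch, so before going that route it can be worth confirming that a CUDA device is actually visible to PyTorch. A minimal check using the standard PyTorch API (not part of this repo) might look like:

```python
import torch

# Report whether PyTorch can see a CUDA-capable GPU.
# If nothing shows up here, a CPU-focused port like llama.cpp
# is probably the more practical option.
if torch.cuda.is_available():
    print("CUDA devices:", torch.cuda.device_count())
    print("Device 0:", torch.cuda.get_device_name(0))
else:
    print("No CUDA device visible to PyTorch.")
```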

j-f1 closed this as not planned Mar 16, 2023
@Whhhatttt
Author

Whhhatttt commented Mar 16, 2023

> You might have better luck with the original LLaMA repo, which I believe supports CUDA. This repo is focused on running on the CPU.

Where can I find the original repo?

@j-f1
Collaborator

j-f1 commented Mar 17, 2023

@dontknowhy

Wait, I don't have any CUDA devices...
So what about OpenCL or Vulkan? (I can use those on my device.)
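
As a quick sanity check on what a machine actually exposes, the third-party pyopencl package (an assumption here, not something this repo uses) can list the OpenCL platforms and devices the installed drivers report:

```python
import pyopencl as cl  # third-party package, assumed installed: pip install pyopencl

# List every OpenCL platform and device the drivers expose.
# This only confirms OpenCL is present; it says nothing about
# whether any given project can take advantage of it.
for platform in cl.get_platforms():
    print("Platform:", platform.name)
    for device in platform.get_devices():
        print("  Device:", device.name)
```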

@Whhhatttt
Author

Maybe we can use OpenCL. CUDA doesn't work well for me; I have a very low-end PC.
