GPU support #3
I would very much appreciate the option to run this in Docker on an external server with GPU support; in my testing of Whisper with an NVIDIA GPU it makes a big improvement. Could this be accomplished with a flag or an environment variable, perhaps?
I've got my instance running on an old gaming PC. It has a couple of NVIDIA 980 Ti's in it (SLI FTW! lol). I've used them for all kinds of experiments, but this would be specifically useful for me.
I have it working on my home server with these changes. Run it something like this, or with docker-compose. If you use my fork, you can run it with docker-compose up (and it will build the changed Docker image for you).
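The exact command from that comment isn't preserved in this thread; the lines below are only a rough, hedged sketch of starting a GPU-enabled image. The image name, port, volume, and model arguments are placeholders, not necessarily the fork's actual values; the --gpus all flag is the part that hands the host's NVIDIA GPU to the container.

# Hypothetical example only: run a locally built GPU-enabled image and
# expose the host's NVIDIA GPU(s) to it via Docker's --gpus flag.
docker run -d --gpus all \
  -p 10300:10300 \
  -v whisper-data:/data \
  wyoming-whisper-gpu \
  --model medium-int8 --language en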
@pierrewessman can you please give us some insight into how much of an improvement that is?
@pierrewessman can you please help me get your change running? I don't know what I have to do with your command to make it use the correct image.
I guess it depends a lot, but for me it's a big difference. I run the largest available Whisper model in under 2 seconds per request.
First, are you running it as a Docker image? I am using my fork linked above, and I run it with docker-compose.
@pierrewessman thanks for the quick answer! I am a bit of a Docker noob, so sorry for the maybe-dumb question. This is the docker-compose config I just scrambled together, is this correct? It doesn't appear to be any faster than using the CPU right now.

version: '3.6'
…

When I run …
Also, I can run nvidia-smi inside the new whisper container, but it doesn't show any process when I use it, even though the logs show that it receives the audio from Home Assistant.
Here is my example docker-compose file. Looking at yours, it seems like you are still using the original one without GPU support. You need to have all the files from here, and then you can either build the Docker image manually or use the docker-compose file (note the line with "build: ." that tells docker-compose to build from the local Dockerfile).
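The linked compose file itself isn't reproduced in this thread; the sketch below is only a hedged illustration of the idea described above. The "build: ." line and the NVIDIA device reservation are the relevant parts; the service name, port, volume, and command are placeholder assumptions.

# Hypothetical docker-compose sketch: build from the local GPU-enabled
# Dockerfile and reserve the host's NVIDIA GPU for the service.
version: "3.8"
services:
  whisper:
    build: .                    # build the image from the local Dockerfile
    ports:
      - "10300:10300"           # placeholder port
    volumes:
      - whisper-data:/data
    command: --model medium-int8 --language en
    deploy:
      resources:
        reservations:
          devices:
            - driver: nvidia
              count: all
              capabilities: [gpu]
volumes:
  whisper-data: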
Oh okay, so I have to download the Dockerfile and put it somewhere, and then enter the path where it says build:? I've never done this manual process; I've always just used the images published by the people who wrote the docker-compose files.
Oh wait, I think I made it!
This is incredibly fast! Thank you so much @pierrewessman for 1) this awesome work and 2) your quick help! The GTX 1660 runs the medium-int8 model incredibly quickly!!
Hi @pierrewessman, can this be run on an M-series Mac?
Hey, my "fix" only supports acceleration on cuda and I believe faster-whisper (the implementation of whisper that is used, does not take advantage of mc m-series. |
It would be great to have GPU support in case you run Home Assistant on something other than a Raspberry Pi, like KVM with GPU passthrough, or in case you want to use the Docker image on an external server.
I imagine that we just need to change the base image from
FROM debian:bullseye-slim
to

FROM nvidia/cuda:12.1.1-cudnn8-runtime-ubuntu22.04
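As a minimal, hedged sketch of that swap: only the FROM line reflects the suggestion above, and the comment stands in for the unchanged remainder of the Dockerfile. Note that the container would still need to be started with GPU access, for example via --gpus all or a compose device reservation as in the examples above.

# Hypothetical sketch; everything except the FROM line is a placeholder.
FROM nvidia/cuda:12.1.1-cudnn8-runtime-ubuntu22.04

# ... install Python, faster-whisper, and the server as in the original Dockerfile ...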