Add GPU tag to readme #8
Conversation
I am a bot, here are the test results for this PR:
I tested the following compose service on my server:
It functions, but I immediately noticed that it was running on the CPU instead of the GPU as it's supposed to. Note that I'm running a similar stack from LinuxServer.io for faster-whisper, which does use the GPU correctly. After investigating, I found that piper expects "--use-cuda", NOT "--cuda" as these changes propose. After fixing that, I get an error:
And after further debugging, it appears the empty string comes from piper throwing a segmentation fault on startup. I'm not sure how to continue debugging from here, but it looks like GPU support is not quite ready to work with this container.
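A quick way to confirm the "empty string" really traces back to a crash is to check the child's exit status: `subprocess` reports a negative return code when the process was killed by a signal. This is a generic sketch, not piper-specific; the demo deliberately crashes a child Python interpreter as a stand-in for the crashing binary.

```python
import signal
import subprocess
import sys

def diagnose(cmd):
    """Run a command and report how it exited.

    subprocess gives a negative returncode when the child was killed
    by a signal; -signal.SIGSEGV indicates a segmentation fault.
    """
    proc = subprocess.run(cmd, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
    if proc.returncode < 0:
        return f"killed by {signal.Signals(-proc.returncode).name}"
    return f"exited {proc.returncode}"

# Demo: crash a child interpreter on purpose (reading address 0
# segfaults the process) as a stand-in for the crashing piper binary.
print(diagnose([sys.executable, "-c", "import ctypes; ctypes.string_at(0)"]))
```

Seeing "killed by SIGSEGV" here would match the empty-output symptom: the process dies before writing anything.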
Unfortunately the piper binary silently ignores invalid CLI args, which makes testing it reliably extremely challenging.
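One crude smoke test for this class of problem (a generic sketch, not piper-specific) is to pass a deliberately bogus flag and check whether the exit status changes. A binary that silently swallows unknown args will exit 0 either way, so a typo like "--cuda" vs "--use-cuda" can never be caught from the exit status alone.

```python
import subprocess

def rejects_unknown_flags(binary):
    """Return True if the binary exits nonzero when given a bogus flag.

    A binary that silently ignores invalid CLI args will return
    False here, meaning flag typos go undetected by exit status.
    """
    proc = subprocess.run([binary, "--this-flag-does-not-exist"],
                          stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL)
    return proc.returncode != 0

# `ls` rejects unknown options, so it serves as a well-behaved example.
print(rejects_unknown_flags("ls"))
```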
Honestly, the whole piper project seems to be somewhat abandoned: no releases since 2023, few commits since then, issues and PRs largely ignored, and the readme doesn't even list the correct args for CUDA use. All my tests at the moment are resulting in segfaults, but I'll see if I can make any progress.
It's not our image, as it also segfaults in the nvidia pytorch container.