Can llama.cpp and whisper.cpp share the same ggml.dll? #10278
-
Hello everyone, I was wondering whether llama.cpp and whisper.cpp can share the same ggml.dll file. If so, does the ggml.dll need to be built for the same runtime (i.e. llama.cpp built for cuda-cu12.2.0-x64 and whisper.cpp built for cuda-cu12.2.0-x64; note that both are built for cuda-cu12.2.0-x64)? I noticed that the ggml.dll file differs in size between llama.cpp and whisper.cpp, hence my question.
-
It would be possible if both llama.cpp and whisper.cpp are built with the same version of ggml. There have been some recent changes to ggml that, as far as I know, have not yet landed in whisper.cpp, so at the moment you would need to go back to an older version of llama.cpp to get a matching ggml.
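If you want a rough sanity check that the two builds expect the same ggml API before swapping DLLs, one option (just a sketch, not something from either project) is to compare the export tables of the ggml.dll produced by each build; matching export lists are a reasonable hint that the ggml versions are compatible, although they don't prove struct layouts are identical. The DLL paths below are placeholders, and the check relies on the third-party pefile package.

```python
# Rough compatibility check: compare the export tables of two ggml.dll builds.
# Requires the third-party "pefile" package (pip install pefile).
# The DLL paths below are placeholders -- adjust them to your own build output.
import pefile

def exported_symbols(dll_path: str) -> set[str]:
    """Return the set of symbol names exported by a Windows DLL."""
    pe = pefile.PE(dll_path)
    return {
        sym.name.decode()
        for sym in pe.DIRECTORY_ENTRY_EXPORT.symbols
        if sym.name is not None
    }

llama_ggml = exported_symbols(r"llama.cpp\build\bin\Release\ggml.dll")      # assumed path
whisper_ggml = exported_symbols(r"whisper.cpp\build\bin\Release\ggml.dll")  # assumed path

print("only in llama.cpp's ggml.dll:  ", sorted(llama_ggml - whisper_ggml))
print("only in whisper.cpp's ggml.dll:", sorted(whisper_ggml - llama_ggml))
```

If both sets come out empty, the two projects are at least exporting the same ggml surface, which is what you would expect when they vendor the same ggml version.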