got error when using openvino #1394
Comments
To use OpenVINO, you'll need two models: the original whisper ggml model and the OpenVINO-converted model. Make sure to place both models in the same directory and provide the path of the original whisper ggml model when you run the main program.
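A minimal sketch of that setup, assuming the base.en model and file names following this thread and the whisper.cpp OpenVINO instructions (adjust paths and the main binary location to your build):

```sh
# Both the ggml model and the OpenVINO-converted encoder live in models/:
#   models/ggml-base.en.bin                   <- original whisper ggml model
#   models/ggml-base.en-encoder-openvino.xml  <- OpenVINO IR (graph)
#   models/ggml-base.en-encoder-openvino.bin  <- OpenVINO IR (weights)

# Pass the *ggml* model to -m; a whisper.cpp binary built with OpenVINO
# support looks for the converted encoder in the same directory.
./main -m models/ggml-base.en.bin -f samples/jfk.wav
```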
@jackleibest what machine do you use? Is it inside Docker?
CentOS 7.9, without Docker.
Do you have the OpenVINO toolkit installed on your machine?
Yes, following the instructions:
I have the exact same issue using Ubuntu 22.04 in WSL2 on Windows 11. I have a laptop with a Gen11 CPU and a Gen12 GPU, and OpenVINO installed.
~/whisper.cpp$ ./build/bin/main -m models/ggml-base.en-encoder-openvino.bin -f samples/jfk.wav
I found a solution here: https://github.com//pull/1694#issuecomment-1870984510
The working command is:
I think this can be closed.
I also observed the same issue. However, if the model path only contains ggml-base.bin, it works the same way. Moreover, running the benchmark is almost identical to inference without accelerators.
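One way to compare encoder speed with and without the OpenVINO encoder is the bench tool that ships with whisper.cpp. A rough sketch, assuming a CMake build under ./build and a base.en model under models/ (paths and thread count are placeholders):

```sh
# Benchmark with only the ggml model present (plain CPU encoder).
./build/bin/bench -m models/ggml-base.en.bin -t 4

# Then place ggml-base.en-encoder-openvino.xml/.bin next to the ggml model
# and rerun the same command; if the OpenVINO encoder is actually picked up,
# the encode time should change noticeably.
./build/bin/bench -m models/ggml-base.en.bin -t 4
```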
When I run the command, I very often get mbind failed: Invalid argument in the whisper output.
Command:
Whisper output:
whisper_init_from_file_with_params_no_state: loading model from 'models/ggml-large-v3.bin'
system_info: n_threads = 4 / 20 | AVX = 1 | AVX2 = 1 | AVX512 = 0 | FMA = 1 | NEON = 0 | ARM_FMA = 0 | METAL = 0 | F16C = 1 | FP16_VA = 0 | WASM_SIMD = 0 | BLAS = 0 | SSE3 = 1 | SSSE3 = 1 | VSX = 0 | CUDA = 0 | COREML = 0 | OPENVINO = 1
main: processing '/tmp/test_file.wav' (4612180 samples, 288.3 sec), 4 threads, 1 processors, 5 beams + best of 5, lang = de, task = transcribe, timestamps = 1 ...
mbind failed: Invalid argument
According to the manual, I just want to speed up inference on the CPU via OpenVINO; however, I got the problem below.
(openvino_conv_env) [root@zaozhuang3L-C6-35 whisper.cpp]# ./main -m models/ggml-base.en-encoder-openvino.bin -f samples/jfk.wav
whisper_init_from_file_no_state: loading model from 'models/ggml-base.en-encoder-openvino.bin'
whisper_model_load: loading model
whisper_model_load: invalid model data (bad magic)
whisper_init_no_state: failed to load model
error: failed to initialize whisper context
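For context, the "bad magic" error means the file passed to -m does not start with the ggml file magic: ggml-base.en-encoder-openvino.bin is the weights half of the OpenVINO IR (.xml/.bin pair), not a whisper ggml model. A quick sanity check, assuming xxd is available (the expected magic 0x67676d6c comes from the ggml file format):

```sh
# The whisper ggml model begins with the ggml magic (0x67676d6c);
# the OpenVINO-converted .bin is raw IR weights and will not.
xxd -l 4 models/ggml-base.en.bin
xxd -l 4 models/ggml-base.en-encoder-openvino.bin
```

As the first comment notes, the fix is to pass the original ggml model to -m and keep the OpenVINO-converted files alongside it.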