I was using faster-whisper, and the downloaded model uses relative symlinks (to avoid duplication, I suppose), but the webui (or ctranslate2) doesn't like them:
Traceback (most recent call last):
File "C:\AI\LLM\Subs-AI\venv\Lib\site-packages\streamlit\runtime\scriptrunner\script_runner.py", line 565, in _run_script
exec(code, module.__dict__)
File "C:\AI\LLM\Subs-AI\venv\Lib\site-packages\subsai\webui.py", line 545, in <module>
run()
File "C:\AI\LLM\Subs-AI\venv\Lib\site-packages\subsai\webui.py", line 538, in run
webui()
File "C:\AI\LLM\Subs-AI\venv\Lib\site-packages\subsai\webui.py", line 318, in webui
subs = _transcribe(file_path, stt_model_name, model_config)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\AI\LLM\Subs-AI\venv\Lib\site-packages\streamlit\runtime\caching\cache_utils.py", line 194, in wrapper
return cached_func(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\AI\LLM\Subs-AI\venv\Lib\site-packages\streamlit\runtime\caching\cache_utils.py", line 223, in __call__
return self._get_or_create_cached_value(args, kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\AI\LLM\Subs-AI\venv\Lib\site-packages\streamlit\runtime\caching\cache_utils.py", line 248, in _get_or_create_cached_value
return self._handle_cache_miss(cache, value_key, func_args, func_kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\AI\LLM\Subs-AI\venv\Lib\site-packages\streamlit\runtime\caching\cache_utils.py", line 302, in _handle_cache_miss
computed_value = self._info.func(*func_args, **func_kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\AI\LLM\Subs-AI\venv\Lib\site-packages\subsai\webui.py", line 189, in _transcribe
model = subs_ai.create_model(model_name, model_config=model_config)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\AI\LLM\Subs-AI\venv\Lib\site-packages\subsai\main.py", line 96, in create_model
return AVAILABLE_MODELS[model_name]['class'](model_config)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\AI\LLM\Subs-AI\venv\Lib\site-packages\subsai\models\faster_whisper_model.py", line 240, in __init__
self.model = WhisperModel(model_size_or_path=self._model_size_or_path,
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\AI\LLM\Subs-AI\venv\Lib\site-packages\faster_whisper\transcribe.py", line 120, in __init__
self.model = ctranslate2.models.Whisper(
^^^^^^^^^^^^^^^^^^^^^^^^^^^
RuntimeError: Unable to open file 'model.bin' in model 'C:\Users\Victor\.cache\huggingface\hub\models--guillaumekln--faster-whisper-large-v2\snapshots\f541c54c566e32dc1fbce16f98df699208837e8b'
models--guillaumekln--faster-whisper-large-v2\snapshots\f541c54c566e32dc1fbce16f98df699208837e8b is a folder that contains the model files; those files are symlinks to the actual files in the models--guillaumekln--faster-whisper-large-v2\blobs folder.
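A small sketch (mine, not part of the original report) to confirm that the snapshot entries are symlinks pointing into the blobs folder; the cache path below just mirrors the one from the traceback:

# Check whether the files in the snapshot folder are symlinks into ..\blobs.
from pathlib import Path

snapshot = (Path.home() / ".cache" / "huggingface" / "hub"
            / "models--guillaumekln--faster-whisper-large-v2"
            / "snapshots" / "f541c54c566e32dc1fbce16f98df699208837e8b")

for f in snapshot.iterdir():
    # On Windows without symlink privileges these may instead be real copies.
    target = f.resolve() if f.is_symlink() else "(regular file)"
    print(f.name, "->", target)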
If I copy those files over and rename them, the error goes away, and the rest works flawlessly.
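As a possible alternative to copying and renaming by hand, here is a hedged sketch (my own, assuming huggingface_hub is installed): download the model into a plain local directory so ctranslate2 sees real files instead of symlinks, then point faster-whisper at that directory. The target path and device settings are placeholders.

from huggingface_hub import snapshot_download
from faster_whisper import WhisperModel

# Materialize the model outside the symlinked HF cache layout.
# (On older huggingface_hub versions you may also need local_dir_use_symlinks=False.)
model_dir = snapshot_download(
    "guillaumekln/faster-whisper-large-v2",
    local_dir="C:/AI/models/faster-whisper-large-v2",  # hypothetical target path
)

# Pass the plain directory instead of the model name so nothing re-resolves the cache.
model = WhisperModel(model_dir, device="cuda", compute_type="float16")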