v0.13.3 Docker image fails on startup #1959
Comments
Same for me today: v0.13.2 cannot download models, and v0.13.3 does not work either.
Same problem here, hoping for a fix…
Would it work to use the 0.13.2 image and run pip install xinference==0.13.3 inside it?
As a temporary workaround, you can change the start command to sh -c "pip uninstall -y llama-cpp-python && xinference-local --host 0.0.0.0 --port 8080"
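Applied to the docker run command from the issue body, the suggested workaround would look roughly like this. This is a sketch, not a confirmed fix: the volume paths and registry come from the reporter's command, and the trailing sh -c override is an assumption based on the comment above. Note that uninstalling llama-cpp-python means llama.cpp-format models will be unavailable until it is reinstalled.

```shell
# Sketch: the reporter's docker run command, with the container command
# overridden to remove the incompatible llama-cpp-python wheel before
# starting Xinference (workaround only, per the comment above).
docker run -d \
  -v /data/xinference/:/root/.xinference \
  -v /data/model/:/data/model \
  -e XINFERENCE_MODEL_SRC=modelscope \
  -e XINFERENCE_HOME=/data/model \
  -p 9997:9997 \
  --gpus all \
  registry.cn-hangzhou.aliyuncs.com/xprobe_xinference/xinference:v0.13.3 \
  sh -c "pip uninstall -y llama-cpp-python && xinference-local -H 0.0.0.0"
```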
Duplicate of #1949. Closing this one for now.
xinference-local: command not found
System Info / 系統信息
CUDA: 12.2
Running Xinference with Docker? / 是否使用 Docker 运行 Xinference?
Yes (see the docker run command below)
Version info / 版本信息
Version: v0.13.3
The command used to start Xinference / 用以启动 xinference 的命令
docker run -d \
  -v /data/xinference/:/root/.xinference \
  -v /data/model/:/data/model \
  -e XINFERENCE_MODEL_SRC=modelscope \
  -e XINFERENCE_HOME=/data/model \
  -p 9997:9997 \
  --gpus all \
  registry.cn-hangzhou.aliyuncs.com/xprobe_xinference/xinference:v0.13.3 \
  xinference-local -H 0.0.0.0
Reproduction / 复现过程
Traceback (most recent call last):
  File "/usr/local/lib/python3.10/dist-packages/llama_cpp/llama_cpp.py", line 75, in _load_shared_library
    return ctypes.CDLL(str(_lib_path), **cdll_args)  # type: ignore
  File "/usr/lib/python3.10/ctypes/__init__.py", line 374, in __init__
    self._handle = _dlopen(self._name, mode)
OSError: /usr/lib/x86_64-linux-gnu/libstdc++.so.6: version `GLIBCXX_3.4.29' not found (required by /usr/local/lib/python3.10/dist-packages/llama_cpp/lib/libllama.so)

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/bin/xinference-local", line 5, in <module>
    from xinference.deploy.cmdline import local
  File "/usr/local/lib/python3.10/dist-packages/xinference/__init__.py", line 37, in <module>
    _install()
  File "/usr/local/lib/python3.10/dist-packages/xinference/__init__.py", line 34, in _install
    install_model()
  File "/usr/local/lib/python3.10/dist-packages/xinference/model/__init__.py", line 17, in _install
    from .llm import _install as llm_install
  File "/usr/local/lib/python3.10/dist-packages/xinference/model/llm/__init__.py", line 20, in <module>
    from .core import (
  File "/usr/local/lib/python3.10/dist-packages/xinference/model/llm/core.py", line 26, in <module>
    from ...types import PeftModelConfig
  File "/usr/local/lib/python3.10/dist-packages/xinference/types.py", line 399, in <module>
    from llama_cpp import Llama
  File "/usr/local/lib/python3.10/dist-packages/llama_cpp/__init__.py", line 1, in <module>
    from .llama_cpp import *
  File "/usr/local/lib/python3.10/dist-packages/llama_cpp/llama_cpp.py", line 88, in <module>
    _lib = _load_shared_library(_lib_base_name)
  File "/usr/local/lib/python3.10/dist-packages/llama_cpp/llama_cpp.py", line 77, in _load_shared_library
    raise RuntimeError(f"Failed to load shared library '{_lib_path}': {e}")
RuntimeError: Failed to load shared library '/usr/local/lib/python3.10/dist-packages/llama_cpp/lib/libllama.so': /usr/lib/x86_64-linux-gnu/libstdc++.so.6: version `GLIBCXX_3.4.29' not found (required by /usr/local/lib/python3.10/dist-packages/llama_cpp/lib/libllama.so)
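The chained exception above shows the pattern at play: ctypes.CDLL calls dlopen, dlopen fails because the container's libstdc++ is older than what the bundled libllama.so was compiled against, and llama_cpp re-raises the OSError as a RuntimeError. A minimal sketch of that wrapper pattern (not llama_cpp's actual code, just the shape visible in the traceback):

```python
import ctypes


def load_shared_library(lib_path: str) -> ctypes.CDLL:
    """Sketch of the loader pattern from the traceback: attempt dlopen via
    ctypes.CDLL and re-raise any OSError (e.g. a GLIBCXX version mismatch)
    as a RuntimeError with the library path included."""
    try:
        return ctypes.CDLL(str(lib_path))
    except OSError as e:
        raise RuntimeError(f"Failed to load shared library '{lib_path}': {e}") from e


# A library that cannot be dlopen'ed surfaces as RuntimeError, not OSError,
# which is why the startup failure above ends with RuntimeError:
try:
    load_shared_library("/nonexistent/libllama.so")
except RuntimeError as e:
    print("caught:", e)
```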
Expected behavior / 期待表现
Starts normally.