
v0.13.3 image fails to start #1959

Closed
1 of 3 tasks
shatang123 opened this issue Jul 29, 2024 · 7 comments · Fixed by #1974
Labels
duplicate This issue or pull request already exists gpu
Milestone

Comments

@shatang123

System Info

CUDA: 12.2

Running Xinference with Docker?

- [x] docker
- [ ] pip install
- [ ] installation from source

Version info

Version: v0.13.3

The command used to start Xinference

docker run -d \
  -v /data/xinference/:/root/.xinference \
  -v /data/model/:/data/model \
  -e XINFERENCE_MODEL_SRC=modelscope \
  -e XINFERENCE_HOME=/data/model \
  -p 9997:9997 \
  --gpus all \
  registry.cn-hangzhou.aliyuncs.com/xprobe_xinference/xinference:v0.13.3 \
  xinference-local -H 0.0.0.0

Reproduction

Traceback (most recent call last):
  File "/usr/local/lib/python3.10/dist-packages/llama_cpp/llama_cpp.py", line 75, in _load_shared_library
    return ctypes.CDLL(str(_lib_path), **cdll_args)  # type: ignore
  File "/usr/lib/python3.10/ctypes/__init__.py", line 374, in __init__
    self._handle = _dlopen(self._name, mode)
OSError: /usr/lib/x86_64-linux-gnu/libstdc++.so.6: version `GLIBCXX_3.4.29' not found (required by /usr/local/lib/python3.10/dist-packages/llama_cpp/lib/libllama.so)

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/bin/xinference-local", line 5, in <module>
    from xinference.deploy.cmdline import local
  File "/usr/local/lib/python3.10/dist-packages/xinference/__init__.py", line 37, in <module>
    _install()
  File "/usr/local/lib/python3.10/dist-packages/xinference/__init__.py", line 34, in _install
    install_model()
  File "/usr/local/lib/python3.10/dist-packages/xinference/model/__init__.py", line 17, in _install
    from .llm import _install as llm_install
  File "/usr/local/lib/python3.10/dist-packages/xinference/model/llm/__init__.py", line 20, in <module>
    from .core import (
  File "/usr/local/lib/python3.10/dist-packages/xinference/model/llm/core.py", line 26, in <module>
    from ...types import PeftModelConfig
  File "/usr/local/lib/python3.10/dist-packages/xinference/types.py", line 399, in <module>
    from llama_cpp import Llama
  File "/usr/local/lib/python3.10/dist-packages/llama_cpp/__init__.py", line 1, in <module>
    from .llama_cpp import *
  File "/usr/local/lib/python3.10/dist-packages/llama_cpp/llama_cpp.py", line 88, in <module>
    _lib = _load_shared_library(_lib_base_name)
  File "/usr/local/lib/python3.10/dist-packages/llama_cpp/llama_cpp.py", line 77, in _load_shared_library
    raise RuntimeError(f"Failed to load shared library '{_lib_path}': {e}")
RuntimeError: Failed to load shared library '/usr/local/lib/python3.10/dist-packages/llama_cpp/lib/libllama.so': /usr/lib/x86_64-linux-gnu/libstdc++.so.6: version `GLIBCXX_3.4.29' not found (required by /usr/local/lib/python3.10/dist-packages/llama_cpp/lib/libllama.so)
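The error means the bundled libllama.so was compiled against a newer libstdc++ than the one shipped in the image. To confirm, one can list the GLIBCXX ABI versions the image's libstdc++ actually exposes (a diagnostic sketch; the library path is taken from the traceback and may differ on other distros):

```shell
# List the GLIBCXX ABI versions provided by the system libstdc++.
# libllama.so needs GLIBCXX_3.4.29 (introduced with GCC 11); if the newest
# version printed below is older than that, the prebuilt llama-cpp-python
# wheel cannot be loaded in this image.
grep -ao 'GLIBCXX_[0-9.]*' /usr/lib/x86_64-linux-gnu/libstdc++.so.6 | sort -uV | tail -n 5
```

If the listing stops at, say, GLIBCXX_3.4.28, the wheel was built with a newer toolchain than the base image provides.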

Expected behavior

Starts normally.

@XprobeBot XprobeBot added the gpu label Jul 29, 2024
@XprobeBot XprobeBot added this to the v0.13.4 milestone Jul 29, 2024
@wjhmiss

wjhmiss commented Jul 29, 2024

Same here today: v0.13.2 can't download models, and v0.13.3 doesn't work either.

@RandyChen1985

Same here.

@ConleyKong

Same problem here, hoping for a fix.

@qinxuye
Contributor

qinxuye commented Jul 29, 2024

Can you use the 0.13.2 image and run `pip install xinference==0.13.3` inside it?

@zhanghx0905
Contributor

As a temporary workaround, change the startup command to `sh -c "pip uninstall -y llama-cpp-python && xinference-local --host 0.0.0.0 --port 8080"`.
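Applied to the `docker run` invocation from the issue description, the workaround would look roughly like this (a sketch, not verified against this image; it keeps the reporter's port mapping of 9997 rather than 8080, and the volume paths are the reporter's):

```shell
# Temporary workaround: remove the incompatible llama-cpp-python wheel
# before starting xinference, so the import of llama_cpp is skipped.
docker run -d \
  -v /data/xinference/:/root/.xinference \
  -v /data/model/:/data/model \
  -e XINFERENCE_MODEL_SRC=modelscope \
  -e XINFERENCE_HOME=/data/model \
  -p 9997:9997 \
  --gpus all \
  registry.cn-hangzhou.aliyuncs.com/xprobe_xinference/xinference:v0.13.3 \
  sh -c "pip uninstall -y llama-cpp-python && xinference-local -H 0.0.0.0"
```

Note this disables GGUF model support until a fixed llama-cpp-python build is installed.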

@qinxuye
Contributor

qinxuye commented Jul 29, 2024

Duplicate of #1949. Closing this one for now.

@qinxuye qinxuye closed this as completed Jul 29, 2024
@qinxuye qinxuye added the duplicate This issue or pull request already exists label Jul 29, 2024
@John-ken

> As a temporary workaround, change the startup command to `sh -c "pip uninstall -y llama-cpp-python && xinference-local --host 0.0.0.0 --port 8080"`.

xinference-local: command not found
