
v0.13.3 Docker image fails to start #1949

Closed
1 of 3 tasks
eric1932 opened this issue Jul 26, 2024 · 4 comments · Fixed by #1974

eric1932 commented Jul 26, 2024

System Info

docker run --gpus all xprobe/xinference xinference-local

Traceback (most recent call last):
  File "/usr/local/lib/python3.10/dist-packages/llama_cpp/llama_cpp.py", line 75, in _load_shared_library
    return ctypes.CDLL(str(_lib_path), **cdll_args)  # type: ignore
  File "/usr/lib/python3.10/ctypes/__init__.py", line 374, in __init__
    self._handle = _dlopen(self._name, mode)
OSError: libcuda.so.1: cannot open shared object file: No such file or directory

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/bin/xinference-local", line 5, in <module>
    from xinference.deploy.cmdline import local
  File "/usr/local/lib/python3.10/dist-packages/xinference/__init__.py", line 37, in <module>
    _install()
  File "/usr/local/lib/python3.10/dist-packages/xinference/__init__.py", line 34, in _install
    install_model()
  File "/usr/local/lib/python3.10/dist-packages/xinference/model/__init__.py", line 17, in _install
    from .llm import _install as llm_install
  File "/usr/local/lib/python3.10/dist-packages/xinference/model/llm/__init__.py", line 20, in <module>
    from .core import (
  File "/usr/local/lib/python3.10/dist-packages/xinference/model/llm/core.py", line 26, in <module>
    from ...types import PeftModelConfig
  File "/usr/local/lib/python3.10/dist-packages/xinference/types.py", line 399, in <module>
    from llama_cpp import Llama
  File "/usr/local/lib/python3.10/dist-packages/llama_cpp/__init__.py", line 1, in <module>
    from .llama_cpp import *
  File "/usr/local/lib/python3.10/dist-packages/llama_cpp/llama_cpp.py", line 88, in <module>
    _lib = _load_shared_library(_lib_base_name)
  File "/usr/local/lib/python3.10/dist-packages/llama_cpp/llama_cpp.py", line 77, in _load_shared_library
    raise RuntimeError(f"Failed to load shared library '{_lib_path}': {e}")
RuntimeError: Failed to load shared library '/usr/local/lib/python3.10/dist-packages/llama_cpp/lib/libllama.so': libcuda.so.1: cannot open shared object file: No such file or directory
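The root cause is that `ctypes.CDLL` cannot resolve `libcuda.so.1`, which the NVIDIA container runtime normally injects into GPU containers. A minimal probe to check whether a given shared library is loadable (a hypothetical helper for diagnosis, not part of Xinference):

```python
import ctypes

def shared_lib_loadable(name: str) -> bool:
    """Return True if the dynamic linker can resolve and open `name`."""
    try:
        ctypes.CDLL(name)
        return True
    except OSError:
        return False
```

Running `shared_lib_loadable("libcuda.so.1")` inside the container distinguishes a missing driver mount (the error above) from a problem inside llama-cpp-python itself.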

Running Xinference with Docker?

  • docker
  • pip install
  • installation from source

Version info

v0.13.3

The command used to start Xinference

docker run --gpus all xprobe/xinference xinference-local

Reproduction

No additional steps required.

Expected behavior

The container should not exit.

@XprobeBot XprobeBot added the gpu label Jul 26, 2024
@XprobeBot XprobeBot added this to the v0.13.4 milestone Jul 26, 2024
@eric1932 eric1932 reopened this Jul 26, 2024
eric1932 (Author)

I also tested docker run --gpus all xprobe/xinference:v0.13.2 xinference-local --log-level debug, which runs normally.

zhanghx0905 (Contributor)

As a temporary workaround, change the start command to sh -c "pip uninstall -y llama-cpp-python && xinference-local --host 0.0.0.0 --port 8080"

@qinxuye qinxuye mentioned this issue Jul 29, 2024
ConleyKong

Following the official suggestion, changing the bundled llama-cpp-python to 0.2.28 lets the container start normally.
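For reference, one way to apply that pin without rebuilding the image is to override the start command (a sketch based on the commands in this thread; the exact wheel or index needed for a CUDA-enabled build of llama-cpp-python may differ):

```shell
# Override the container start command: downgrade llama-cpp-python to
# 0.2.28 (the version reported working above) before launching the server.
docker run --gpus all xprobe/xinference \
  sh -c "pip install 'llama-cpp-python==0.2.28' && xinference-local --host 0.0.0.0"
```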

Lakr233 commented Nov 1, 2024

This fix looks problematic: with libcuda gone, the GPU is completely unusable, so deleting llama.cpp treats the symptom rather than the cause.

On my side I resolved the problem by exposing libcuda via export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/usr/local/cuda/compat/
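That workaround can be folded into a small entrypoint fragment (a sketch; `/usr/local/cuda/compat` is where NVIDIA CUDA base images keep the driver forward-compatibility libraries, and the guard makes it a no-op on images without that directory):

```shell
# Expose the forward-compat libcuda to the dynamic linker if present,
# then start the server as usual.
COMPAT_DIR=/usr/local/cuda/compat
if [ -d "$COMPAT_DIR" ]; then
  export LD_LIBRARY_PATH="${LD_LIBRARY_PATH:+$LD_LIBRARY_PATH:}$COMPAT_DIR"
fi
# xinference-local --host 0.0.0.0   # launch command from this thread
```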
