
Starting tabbyml behind nginx reverse proxy requires websocket support #3289

Closed
oli-ver opened this issue Oct 20, 2024 · 0 comments · Fixed by #3291

oli-ver (Contributor) commented Oct 20, 2024

Describe the bug
Running any prompt in the chat box on the Tabby web page or in the VS Code plugin results in the following error message:

{
  "error": true,
  "message": "[Network] undefined"
}

Information about your version
Please provide output of tabby --version

./tabby --version
tabby 0.18.0

Information about your GPU
Please provide output of nvidia-smi

Only CPU mode (linux virtual server)

Additional context
Add any other context about the problem here.

Running tabby natively or in Docker on Debian GNU/Linux 11 (bullseye). No GPU available on the system.

Installation steps native:

useradd -m -s /usr/bin/zsh tabby
su - tabby
wget https://github.com/TabbyML/tabby/releases/download/v0.18.0/tabby_x86_64-manylinux2014.zip
unzip tabby_x86_64-manylinux2014.zip
mv dist/* ~
chmod +x tabby llama-server
TABBY_WEBSERVER_JWT_TOKEN_SECRET=mysecret ./tabby serve --model StarCoder-1B --chat-model Qwen2-1.5B-Instruct --port 8090 --host 127.0.0.1 --device cpu

Installation steps docker:

services:
  tabby:
    restart: always
    image: registry.tabbyml.com/tabbyml/tabby
    environment:
     - ENDPOINT=https://my-endpoint.com
     - TABBY_WEBSERVER_JWT_TOKEN_SECRET=mysecret
     - LD_LIBRARY_PATH=/usr/local/cuda/lib64:/usr/local/cuda/compat:$ #workaround cuda errors, see https://github.com/TabbyML/tabby/issues/2634#issuecomment-2244530283
    entrypoint: /opt/tabby/bin/tabby-cpu
    command: serve --device cpu --model StarCoder-1B --chat-model Qwen2-1.5B-Instruct
    volumes:
      - "./data:/data"
    ports:
      - 127.0.0.1:8090:8080

The only configuration I made was to create the first admin user and set the endpoint URL of my server. I am running tabby behind an nginx reverse proxy with SSL. Both setups, native and Docker, result in the same error.

While writing this bug report I also found the resolution. I tried running tabby without the reverse proxy and it worked, so the problem seemed to be related to my nginx reverse proxy configuration. Initially it was:

        location / {
            proxy_pass       http://localhost:8090;
            proxy_set_header Host $host;
            proxy_set_header X-Real-IP $remote_addr;
        }

Following the instructions found here I changed it to:

        location / {
            proxy_pass       http://localhost:8090;
            proxy_set_header Host $host;
            proxy_set_header X-Real-IP $remote_addr;
            proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
            proxy_http_version 1.1;
            proxy_set_header Upgrade $http_upgrade;
            proxy_set_header Connection "upgrade";
        }

With this configuration it works.
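For reference, a complete SSL-terminating server block might look like the sketch below. The domain name and certificate paths are placeholders, not taken from my actual setup:

```nginx
server {
    listen 443 ssl;
    server_name my-endpoint.com;  # placeholder domain

    # placeholder certificate paths
    ssl_certificate     /etc/letsencrypt/live/my-endpoint.com/fullchain.pem;
    ssl_certificate_key /etc/letsencrypt/live/my-endpoint.com/privkey.pem;

    location / {
        proxy_pass       http://localhost:8090;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;

        # Required for the chat's WebSocket connection: use HTTP/1.1
        # and forward the Upgrade/Connection handshake headers.
        proxy_http_version 1.1;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection "upgrade";
    }
}
```

Without the last three directives, nginx drops the WebSocket upgrade handshake and the chat request fails with the "[Network] undefined" error shown above.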

I'm sharing this experience mainly for other users running into the same problem.

I will create a pull request for a documentation update, covering both running behind a reverse proxy and running in Docker without a GPU.

@wsxiaoys wsxiaoys changed the title Starting tabbyml behind nginx reverse proxy results in "[Network] undefined" error message Starting tabbyml behind nginx reverse proxy requires websocket support Oct 21, 2024
@TabbyML TabbyML locked and limited conversation to collaborators Oct 21, 2024
@wsxiaoys wsxiaoys converted this issue into discussion #3292 Oct 21, 2024

