
[Bug] When using LobeChat, I encountered a bug regarding the connectivity check for locally proxied Ollama to the internet. #4413

Open
fdg157 opened this issue Oct 19, 2024 · 2 comments
Labels
🐛 Bug Something isn't working | 缺陷 ollama Relative to Ollama Provider and ollama models

Comments


fdg157 commented Oct 19, 2024

📦 Deployment environment

Docker

📌 Software version

1.22.8

💻 System environment

Other Linux

🌐 Browser

Firefox

🐛 Problem description

When access to the Ollama endpoint is restricted to the cloud server's IPs, the frontend web page fails the connectivity check when initiating a query, regardless of whether client request mode is used.

📷 Reproduction steps

  1. Expose the local Ollama instance to the internet through a proxy, restricting access to the cloud server's IPs (via Cloudflare WAF).
  2. Initiate a query from the frontend web page.
  3. The connectivity check fails regardless of whether client request mode is used.

🚦 Expected result

The frontend web page should pass the connectivity check normally and be able to start a session with the local model.

📝 Additional information

No response

@fdg157 fdg157 added the 🐛 Bug Something isn't working | 缺陷 label Oct 19, 2024
@lobehubbot (Member) commented:

👀 @fdg157

Thank you for raising an issue. We will investigate the matter and get back to you as soon as possible.
Please make sure you have given us as much context as possible.

@dosubot dosubot bot added the ollama Relative to Ollama Provider and ollama models label Oct 19, 2024

dosubot bot commented Oct 19, 2024

Hi @fdg157! I'm Dosu, a bot that can help you solve problems, answer questions, and become a contributor. I'll help you while you wait for a human maintainer.

To address the connectivity check failure in LobeChat when using a locally proxied Ollama service with restricted IP access, you need to configure Ollama for cross-origin access. This involves setting the OLLAMA_ORIGINS environment variable to allow access from any origin. Here are the steps based on your operating system:

On macOS:

  1. Set the environment variable using launchctl:
    launchctl setenv OLLAMA_ORIGINS "*"
  2. Restart the Ollama application.
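
A quick way to confirm the variable took effect (an optional spot check, assuming Ollama is listening on its default port 11434) is to send a request with an Origin header and inspect the response:

    # With OLLAMA_ORIGINS="*", the response should include an
    # Access-Control-Allow-Origin header permitting the origin.
    curl -i -H "Origin: https://example.com" http://127.0.0.1:11434/api/version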

On Windows:

  1. Exit the Ollama program from the taskbar.
  2. Edit system environment variables from the Control Panel.
  3. Create or edit the OLLAMA_ORIGINS environment variable for your user account, setting the value to *.
  4. Save the changes and restart the system.
  5. Run Ollama again.
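
As an alternative to the Control Panel UI, the same change can be made from a Command Prompt (a sketch: setx persists the variable for the current user, and Ollama must still be restarted afterwards):

    REM Persist OLLAMA_ORIGINS for the current user, then restart Ollama.
    setx OLLAMA_ORIGINS "*"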

On Linux:

  1. Edit the systemd service:
    sudo systemctl edit ollama.service
  2. Add the following under [Service]:
    [Service]
    Environment="OLLAMA_HOST=0.0.0.0"
    Environment="OLLAMA_ORIGINS=*"
  3. Save and exit.
  4. Reload systemd and restart Ollama:
    sudo systemctl daemon-reload
    sudo systemctl restart ollama
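
To verify the override was applied (an optional sanity check, not part of the steps above):

    # Show the environment systemd passes to the service; both
    # OLLAMA_HOST and OLLAMA_ORIGINS should appear in the output.
    systemctl show ollama.service -p Environment
    # Confirm the API answers locally.
    curl http://127.0.0.1:11434/api/version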

Using Docker:

  1. Run the Docker container with the environment variable:
    docker run -d --gpus=all -v ollama:/root/.ollama -e OLLAMA_ORIGINS="*" -p 11434:11434 --name ollama ollama/ollama
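
You can then check that the variable is present inside the running container (an optional spot check, using the container name from the command above):

    # Should print OLLAMA_ORIGINS=* if the -e flag was applied.
    docker exec ollama env | grep OLLAMA_ORIGINS
    # And the API should respond from the host.
    curl http://127.0.0.1:11434/api/version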

Additionally, ensure that the DEFAULT_BASE_URL in your LobeChat configuration is set to http://127.0.0.1:11434, as this is the default host used by the Ollama service. If you have customized the base URL, make sure it matches the configuration in your Ollama service setup. These steps should help resolve the connectivity issue by allowing cross-origin requests to the Ollama service [1][2].
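
If LobeChat itself runs in Docker and needs to reach a proxied Ollama endpoint, the base URL can also be supplied when starting the container (a sketch assuming the OLLAMA_PROXY_URL variable supported by LobeChat's Docker image; the URL below is a placeholder for your actual endpoint):

    # Point LobeChat's server-side Ollama requests at the proxied endpoint.
    docker run -d -p 3210:3210 \
      -e OLLAMA_PROXY_URL=https://your-ollama-proxy.example \
      --name lobe-chat lobehub/lobe-chat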

To continue talking to Dosu, mention @dosu.
