This issue was moved to a discussion.

You can continue the conversation there. Go to discussion →


[Question] No streaming responses when deploying with Cloudflare Tunnel #781

Closed
wenwen12345 opened this issue Dec 23, 2023 · 5 comments
Labels
😇 Help Wanted Need help | 需要帮助

Comments

@wenwen12345

🧐 Problem Description | Proposed Solution

As the title says: streaming responses do not work when deploying behind a Cloudflare Tunnel.

📝 补充信息 | Additional Information

No response

@wenwen12345 wenwen12345 added the 😇 Help Wanted Need help | 需要帮助 label Dec 23, 2023
@lobehubbot
Member

👀 @wenwen12345

Thank you for raising an issue. We will investigate into the matter and get back to you as soon as possible.
Please make sure you have given us as much context as possible.

@arvinxx
Contributor

arvinxx commented Dec 28, 2023

Take a look at whether this is a related problem: #540

@wenwen12345
Author

oneapi is not behind HTTPS; only lobechat is exposed through the tunnel.


@Evsio0n

Evsio0n commented Jan 11, 2024

This is unrelated to #540.
See cloudflare/cloudflared#199 (comment):
the response needs a text/event-stream Content-Type header before Cloudflare will skip its proxy buffering.
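The server-side requirement above can be sketched with a minimal stdlib-only Python handler (a hypothetical example, not LobeChat's actual code): the SSE response must advertise text/event-stream so Cloudflare passes chunks through unbuffered.

```python
# Minimal SSE sketch. Assumptions: stdlib only; SSEHandler and serve_once are
# hypothetical names for illustration, not part of any real project.
import http.server
import threading
import urllib.request


class SSEHandler(http.server.BaseHTTPRequestHandler):
    def do_GET(self):
        self.send_response(200)
        # Cloudflare only disables its proxy buffer when it sees this header.
        self.send_header("Content-Type", "text/event-stream")
        self.send_header("Cache-Control", "no-cache")
        self.end_headers()
        # SSE messages are "data: ...\n\n"; flush so each chunk leaves at once.
        self.wfile.write(b"data: hello\n\n")
        self.wfile.flush()

    def log_message(self, *args):
        pass  # silence per-request logging


def serve_once():
    """Serve one request on a random local port and return what a client saw."""
    server = http.server.HTTPServer(("127.0.0.1", 0), SSEHandler)
    port = server.server_address[1]
    t = threading.Thread(target=server.handle_request, daemon=True)
    t.start()
    with urllib.request.urlopen(f"http://127.0.0.1:{port}/", timeout=5) as resp:
        ctype = resp.headers["Content-Type"]
        body = resp.read()
    t.join()
    server.server_close()
    return ctype, body


if __name__ == "__main__":
    ctype, body = serve_once()
    print(ctype)
    print(body.decode())
```

With a proxy in front, the same header must survive the proxy chain, which is what the nginx and ingress changes below accomplish.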

Rewrite the nginx reverse proxy like this:

    location /api/openai/chat {
        proxy_pass http://your_backend_service;

        # Add or override the response header
        add_header Content-Type text/event-stream;

        # You usually also need to disable buffering
        proxy_buffering off;

        # other configuration, such as websocket headers, set_header, etc.
        ......
    }

kubernetes nginx-ingress example:

kind: Ingress
apiVersion: networking.k8s.io/v1
metadata:
  name: lobe-chat-ingress-lsoz4j
  namespace: lobe-chat
  labels:
    app.kubernetes.io/name: lobe-chat
    app.kubernetes.io/version: v1
  annotations:
    nginx.ingress.kubernetes.io/proxy-buffering: 'off'
    nginx.ingress.kubernetes.io/configuration-snippet: |
      if ($uri ~* "^/api/openai/chat") {
        more_set_headers "Content-Type: text/event-stream";
      }
...
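After applying either proxy change, a quick client-side check can confirm the header survives the proxy chain. This is a hypothetical verification sketch; the URL in the commented example is an assumption, so point it at your own deployment's /api/openai/chat endpoint.

```python
# Hypothetical check: does the proxied endpoint advertise the Content-Type
# that makes Cloudflare skip its proxy buffer? has_event_stream_header is an
# illustrative helper name, not part of LobeChat.
import urllib.request


def has_event_stream_header(url: str) -> bool:
    """Return True if the endpoint responds with text/event-stream."""
    with urllib.request.urlopen(url, timeout=10) as resp:
        return resp.headers.get_content_type() == "text/event-stream"


# Example (requires a reachable deployment; hypothetical hostname):
# print(has_event_stream_header("https://chat.example.com/api/openai/chat"))
```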

@lobehub lobehub locked and limited conversation to collaborators Jan 11, 2024
@arvinxx arvinxx converted this issue into discussion #1016 Jan 11, 2024

