
/llms/stream endpoint should be text/event-stream type #147

Closed
borgmon opened this issue Aug 2, 2023 · 3 comments

Comments

@borgmon
Contributor

borgmon commented Aug 2, 2023

The content type of /llms/stream should be text/event-stream. This also aligns with the response from the OpenAI endpoint.

Actually, this is already causing issues with cloudflared.

When deployed with cloudflared (Cloudflare Tunnel), the text is buffered until the stream ends.
It seems like setting Content-Type to text/event-stream will fix the issue.

We can simply change this line to fix it.
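For illustration, a minimal sketch of what that change might look like, assuming an edge-function style handler that returns a web-standard Response (the actual handler and the exact line are not shown in this thread, and the names below are hypothetical):

```ts
// Hypothetical sketch, not the project's actual handler: return the upstream
// token stream with Content-Type set to text/event-stream so proxies such as
// cloudflared flush chunks immediately instead of buffering the whole response.
export async function streamHandler(upstream: ReadableStream<Uint8Array>): Promise<Response> {
  return new Response(upstream, {
    headers: {
      'Content-Type': 'text/event-stream', // previously a plain/implicit text type
      'Cache-Control': 'no-cache',         // commonly paired with event-stream responses
    },
  });
}
```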

@borgmon
Contributor Author

borgmon commented Aug 2, 2023

I already created a PR 👍

@enricoros
Owner

Merged #148, thanks. Hopefully this won't cause problems with other deployment providers.

Technically the output of the /llms/stream API is progressive text (e.g. token by token, or around 3 chars at a time) and not an event stream (a specially encoded text with newline-delimited framing), but if this workaround works, then strict correctness doesn't matter much.
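For reference, a small illustrative sketch of the difference between the two formats (not the project's actual wire format):

```ts
// Progressive plain text: raw chunks, no framing, concatenated as-is by the client.
const plainChunks = ['Hel', 'lo ', 'wor', 'ld'];

// A real text/event-stream (SSE) payload frames each event as a "data:" line
// terminated by a blank line.
const sseFrame = (chunk: string) => `data: ${chunk}\n\n`;
const sseChunks = plainChunks.map(sseFrame);
// => ['data: Hel\n\n', 'data: lo \n\n', 'data: wor\n\n', 'data: ld\n\n']
```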

Thanks for providing this!

@enricoros
Owner

Closing this ticket. Not sure this is the correct technical solution, since we stream plain text, not 'events', from our backend (edge functions) to our UI. But so far there seem to be no issues.

NOTE: the vercel/ai SDK uses text/plain, here: https://github.com/vercel-labs/ai/blob/980004b512066dda3dfe49ff1432cc72b8c08450/packages/core/streams/streaming-text-response.ts#L28C38-L28C38 -- while we now use 'text/event-stream' -- again, hopefully this divergence is not an issue.
