Making /v1beta/chat/completions streaming output compatible with OpenAI #1076
Labels: enhancement (New feature or request)
Please describe the feature you want

Currently, /v1beta/chat/completions generates streaming output as newline-delimited JSON (JSON lines). We'd like to make the response format compatible with OpenAI's text/event-stream (Server-Sent Events) streaming response.

Additional context
Discuss in slack: https://tabbyml.slack.com/archives/C05CWLZ0Y85/p1701451409878009
Code Location: https://github.com/TabbyML/tabby/blob/main/crates/tabby/src/routes/chat.rs#L39
llama.cpp's server example on text/event-stream: https://github.com/ggerganov/llama.cpp/blob/master/examples/server/server.cpp#L2775
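For reference, the difference between the two formats is just the framing: OpenAI-compatible streaming wraps each JSON chunk in SSE framing (`data: <json>` followed by a blank line, with a final `data: [DONE]` event), while the current endpoint emits bare JSON lines. A minimal Rust sketch of that framing, assuming a hypothetical `to_sse_event` helper (not Tabby's actual code):

```rust
// Hypothetical helper: wrap one streamed JSON chunk in OpenAI-style SSE
// framing, i.e. a "data: " prefix and a blank line after each event.
fn to_sse_event(json_chunk: &str) -> String {
    format!("data: {}\n\n", json_chunk)
}

fn main() {
    // Example delta chunk, shaped like an OpenAI chat.completion.chunk payload.
    let chunk = r#"{"choices":[{"delta":{"content":"Hello"}}]}"#;
    print!("{}", to_sse_event(chunk));
    // The stream is terminated by a sentinel "[DONE]" event.
    print!("data: [DONE]\n\n");
}
```

In an axum-based server like Tabby's, the same effect could likely be achieved with axum's built-in `Sse` response type rather than hand-rolling the framing.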