SSR (Server Sent Event support)/Streaming support in WebChat #4939
Comments
We are on this. 😉
Will this Server-Sent Events feature also be integrated into Microsoft Bot Framework? We are planning to use Bot Framework as a middleware and can't find a suitable way to stream from Bot Framework to Web Chat in a conversation.
Do we have any timelines for this, for both the changes in Web Chat and the changes in Bot Framework? Also, for the latter, which SDK will have these changes?
Hello William, we are experimenting with this in Web Chat. First of all, for streaming on the chatbot web-service side, which is based on the Bot Framework Python SDK, we make a chat completion call to OpenAI; it returns a word-by-word (streaming) response. We plan to push it over the WebSocket as a Bot Framework Activity, possibly with a custom payload indicating the start and end of streaming.
Do you have any suggestions on how we should go about implementing this?
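The plan described above (wrapping a word-by-word OpenAI response in Activities with custom start/end markers) could be sketched roughly as below. Note this is an assumption-laden sketch: the `channelData` field names (`streaming: start/chunk/end`) are hypothetical markers for the custom payload the comment mentions, not an official Bot Framework contract, and the activities are shown as plain dicts rather than SDK `Activity` objects.

```python
from typing import Iterable, Iterator


def to_stream_activities(tokens: Iterable[str], conversation_id: str) -> Iterator[dict]:
    """Wrap a token stream in Bot Framework-style activity dicts.

    The "streaming" flags in channelData ("start", "chunk", "end") are
    hypothetical markers, not an official Bot Framework contract.
    """
    text = ""
    first = True
    for token in tokens:
        text += token
        yield {
            "type": "message",
            "conversation": {"id": conversation_id},
            "text": text,  # cumulative text so far
            "channelData": {"streaming": "start" if first else "chunk"},
        }
        first = False
    # Final activity repeats the full text and signals end of the stream.
    yield {
        "type": "message",
        "conversation": {"id": conversation_id},
        "text": text,
        "channelData": {"streaming": "end"},
    }
```

In a real bot, each yielded dict would become an `Activity` sent (or used to update a previous activity) on the conversation; here the generator only shows the shape of the payload sequence.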
@yogimogi I have a working solution in production that 'streams' packages of 20 characters. The reason is that Direct Line is a premium channel, and sending individual tokens to Web Chat would just be too expensive in the end.
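The 20-character batching described in this comment can be sketched as a small buffering generator; the function name and chunk size here are illustrative, not taken from the commenter's actual code:

```python
from typing import Iterable, Iterator


def chunk_tokens(tokens: Iterable[str], chunk_size: int = 20) -> Iterator[str]:
    """Accumulate streamed tokens and emit the cumulative text only once at
    least `chunk_size` new characters have arrived since the last emit,
    plus a final flush for any remainder.

    This trades rendering granularity for fewer messages on a metered
    channel like Direct Line.
    """
    text = ""
    pending = 0  # characters accumulated since the last emit
    for token in tokens:
        text += token
        pending += len(token)
        if pending >= chunk_size:
            yield text
            pending = 0
    if pending:
        yield text  # flush whatever is left at end of stream
```

Each emitted string is a cumulative snapshot, so the consumer can simply replace the displayed message text rather than appending.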
@compulim as @yogimogi mentioned, on getting a response from OpenAI, we call send_activity for the first time, followed by an update_activity API call. For Teams it works fine, but for Web Chat it gives (BadArgument) This channel does not support this operation. Any reason why? How can we do it otherwise? For now, can you suggest a workaround to get this working?
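Since the channel rejects `update_activity` with BadArgument here, one possible (unofficial) workaround is to detect that failure once and fall back to sending each cumulative snapshot as a fresh activity. The sketch below injects `send` and `update` as plain callables standing in for the SDK's `send_activity`/`update_activity`, so the fallback logic itself is channel-agnostic; this is an assumed pattern, not a documented Bot Framework recipe:

```python
from typing import Callable, Iterable


def stream_with_fallback(
    send: Callable[[str], str],          # stands in for send_activity; returns an activity id
    update: Callable[[str, str], None],  # stands in for update_activity; may raise on channels
                                         # that do not support updates (e.g. BadArgument)
    snapshots: Iterable[str],            # cumulative text snapshots, e.g. "a", "a b", "a b c"
) -> str:
    """Send the first snapshot, then try in-place updates; on the first
    update failure, permanently fall back to sending new activities."""
    it = iter(snapshots)
    activity_id = send(next(it))
    supports_update = True
    for text in it:
        if supports_update:
            try:
                update(activity_id, text)
                continue
            except Exception:
                supports_update = False  # channel rejected the update; stop trying
        activity_id = send(text)
    return activity_id
```

On Teams this would exercise the update path; on a channel that raises, it degrades to one message per snapshot, which is noisier but renders. Combining it with the 20-character batching above would keep the message count down.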
I am adding a
Is your feature request related to a problem?
If we have to implement functionality like https://chat.openai.com/, where the answer is rendered a word (or a few words) at a time instead of arriving complete in one shot, would it be possible to achieve this with BotFramework WebChat? If we get a streamed answer from an LLM (say, via SSE), which we can intercept and send as "a", "a b", "a b c", "a b c d" and so on over the WebSocket, is there some way to make Web Chat render it like https://chat.openai.com/ does?
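The cumulative snapshots described above ("a", "a b", "a b c", ...) can be produced from an intercepted token stream with a trivial generator; the function name is illustrative, and the word-joining assumes whitespace-delimited tokens rather than raw model tokens:

```python
from typing import Iterable, Iterator


def cumulative_updates(answer_tokens: Iterable[str]) -> Iterator[str]:
    """Turn a stream of words into the growing-prefix snapshots
    ("a", "a b", "a b c", ...) to send over the WebSocket, so the
    client can re-render the message with each update."""
    words: list[str] = []
    for token in answer_tokens:
        words.append(token)
        yield " ".join(words)
```

Whether Web Chat re-renders each snapshot in place then depends on the channel's support for activity updates, which is what the rest of this thread discusses.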
Describe the suggestion or request in detail
No response
Describe alternatives you have considered
No response
Additional context
No response