
SSE (Server-Sent Events)/Streaming support in WebChat #4939

Closed
yogimogi opened this issue Oct 30, 2023 · 7 comments · Fixed by #5141 or #5265
Assignees: compulim
Labels: Bot Services, customer-replied-to, customer-reported, feature-request

Comments

@yogimogi

Is your feature request related to a problem?

If we have to implement functionality like https://chat.openai.com/, where the answer is rendered a word (or a few words) at a time instead of arriving in one shot, would it be possible to achieve it with Bot Framework WebChat? If we get a streamed answer from an LLM (say via SSE), which we can intercept and send as "a", "a b", "a b c", "a b c d" and so on over the WebSocket, is there some way to make WebChat render it the way https://chat.openai.com/ does?
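For illustration only (not part of the original request), here is a minimal sketch of the interception step described above: consuming a streamed chat completion and yielding cumulative snapshots ("a", "a b", "a b c", ...). It assumes the openai Python client; the model name and prompt handling are placeholders.

```python
# Minimal sketch (assumption: openai>=1.x Python client; model name is a placeholder).
from openai import OpenAI

client = OpenAI()

def cumulative_snapshots(prompt: str):
    """Yield the answer so far ("a", "a b", "a b c", ...) as tokens stream in."""
    stream = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model name
        messages=[{"role": "user", "content": prompt}],
        stream=True,
    )
    answer_so_far = ""
    for chunk in stream:
        delta = chunk.choices[0].delta.content or ""
        if delta:
            answer_so_far += delta
            yield answer_so_far  # each snapshot is the full text rendered so far
```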

Describe the suggestion or request in detail

No response

Describe alternatives you have considered

No response

Additional context

No response

yogimogi added the Bot Services, customer-reported and feature-request labels on Oct 30, 2023
@compulim
Contributor

We are on this. 😉

compulim added the customer-replied-to label on Oct 30, 2023
compulim self-assigned this on Oct 30, 2023
@tyfs

tyfs commented Jan 20, 2024

Will this Server-Sent Events feature also be integrated into the Microsoft Bot Framework? We are planning to use Bot Framework as middleware and can't find a suitable way to stream from Bot Framework to WebChat in a conversation.

@yogimogi
Author

yogimogi commented Mar 5, 2024

Do we have any timelines for this, both for the changes in WebChat and for the changes in Bot Framework? Also, for the latter, which SDK will have these changes?

@yogimogi
Author

yogimogi commented Mar 8, 2024

Hello William,

We are experimenting with this in WebChat. For streaming on the chatbot web-service side, which is based on the Bot Framework Python SDK, we make a chat completion call to OpenAI; it returns a word-by-word (streaming) response, and we plan to push it over the WebSocket as a Bot Framework Activity, possibly with some custom payload that indicates the start and end of streaming.

  • We experimented with sending the payload as an attachment in the activity, but on the WebChat side attachmentMiddleware gets called multiple times even when we send it a single word.
  • The other thing we tried was to use the same activity id for the messages that belong to one streaming response. So if the streamed response is "a", then "b", then "c", we first send an activity that only has "a", then one that has "a b", then "a b c", but all of these activity objects carry the same activity id. This seems to be working (a minimal sketch of this approach follows below).
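Purely as an illustration of the same-activity-id approach described in the second bullet, here is a minimal sketch assuming the botbuilder Python SDK and a channel that honours a caller-supplied activity id (which may not hold for every channel):

```python
# Minimal sketch (assumptions: botbuilder Python SDK; the channel accepts a
# caller-supplied activity id so WebChat replaces the earlier bubble).
from botbuilder.core import TurnContext
from botbuilder.schema import Activity, ActivityTypes


async def stream_reply(turn_context: TurnContext, snapshots):
    """Send each cumulative snapshot ("a", "a b", "a b c", ...) under one activity id."""
    stream_activity_id = None
    for text in snapshots:
        activity = Activity(type=ActivityTypes.message, text=text)
        if stream_activity_id is None:
            response = await turn_context.send_activity(activity)
            stream_activity_id = response.id  # reuse this id for every later snapshot
        else:
            activity.id = stream_activity_id
            await turn_context.send_activity(activity)
```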

Do you have any suggestions on how we should go about implementing this?

@HesselWellema

@yogimogi I have a working solution in production that 'streams' packages of 20 characters. The reason is that Direct Line is a premium channel and sending individual tokens to WebChat would just be too expensive in the end.
I really hope that they look into something like https://github.com/Azure-Samples/ai-chat-app-protocol
There are already some great samples available, but none of them is feasible if you want more than just asking a question of a vector database.
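A minimal sketch of that batching idea (a hypothetical helper, not the commenter's production code): buffer the streamed tokens and only emit a package once roughly 20 characters have accumulated, which cuts down the number of Direct Line messages.

```python
# Minimal sketch (hypothetical helper; the 20-character package size follows the comment above).
def batch_stream(tokens, package_size: int = 20):
    """Group streamed tokens into roughly package_size-character packages before sending."""
    buffer = ""
    for token in tokens:
        buffer += token
        if len(buffer) >= package_size:
            yield buffer
            buffer = ""
    if buffer:  # flush whatever is left when the stream ends
        yield buffer
```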

@tez-sha

tez-sha commented Mar 12, 2024

@compulim as @yogimogi mentioned, on getting a response from OpenAI we call send_activity the first time, followed by update_activity API calls. For Teams it works fine, but for WebChat it gives "(BadArgument) This channel does not support this operation". Any reason why? How can we do it otherwise? For now, can you suggest some workaround to get this working?
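For reference, a minimal sketch of the send_activity/update_activity sequence described above, assuming the botbuilder Python SDK; the exception handling only illustrates where the Direct Line error surfaces and is not a confirmed workaround.

```python
# Minimal sketch (assumption: botbuilder Python SDK; the exact exception type raised
# by the connector may differ, so a broad except is used here for illustration).
from botbuilder.core import TurnContext
from botbuilder.schema import Activity, ActivityTypes


async def send_then_update(turn_context: TurnContext, snapshots):
    """send_activity once, then update_activity with each new cumulative snapshot."""
    snapshots = iter(snapshots)
    response = await turn_context.send_activity(
        Activity(type=ActivityTypes.message, text=next(snapshots))
    )
    for text in snapshots:
        update = Activity(type=ActivityTypes.message, text=text, id=response.id)
        try:
            await turn_context.update_activity(update)
        except Exception:
            # Web Chat over Direct Line rejects this with
            # "(BadArgument) This channel does not support this operation",
            # so a different strategy is needed there (see the earlier comments).
            break
```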

@compulim
Contributor

I am adding a LIVESTREAMING.md in PR #5265. The document outlines how to send a livestreamed response from the bot code and also covers bot/channel/client support.
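The authoritative wire format is whatever LIVESTREAMING.md specifies. Purely as an unverified illustration of the general shape such a protocol could take (the channelData field names below are assumptions and must be checked against that document in PR #5265):

```python
# Unverified illustration only: the channelData field names ("streamType", "streamSequence",
# "streamId") are assumptions here and must be checked against LIVESTREAMING.md in PR #5265.
from botbuilder.schema import Activity, ActivityTypes

def interim_chunk(text: str, sequence: int) -> Activity:
    """An in-progress chunk of a livestreamed reply."""
    return Activity(
        type=ActivityTypes.typing,
        text=text,
        channel_data={"streamType": "streaming", "streamSequence": sequence},
    )

def final_message(text: str, stream_id: str) -> Activity:
    """The final, complete message that closes the livestream."""
    return Activity(
        type=ActivityTypes.message,
        text=text,
        channel_data={"streamType": "final", "streamId": stream_id},
    )
```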
