Hi,
Does anyone know if I can display a ChatGPT-like streaming response in Streamlit using streamlit_chat's message()?
I need something like message(streaming=True) or some other alternative. My code segment is as follows:
from streamlit_chat import message
import streamlit as st

# Render the chat history, newest exchange first
for i in range(len(st.session_state['generated']) - 1, -1, -1):
    message(st.session_state['past'][i], is_user=True, key=str(i) + '_user')
    message(st.session_state["generated"][i], key=str(i))
The issue with this synchronous approach is that it takes a really long time to get a response.
If this supported streaming (for example, LangChain supports streaming via the on_llm_new_token callback), it would significantly improve the usability of this chat application.
For now it's a cool idea, but without text streaming it's just not usable in real-world scenarios.
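For anyone landing here, a rough sketch of how LangChain's on_llm_new_token callback could be pointed at a Streamlit placeholder, sidestepping streamlit_chat for the streaming part. The placeholder handling and the predict() call on the last 'past' entry are illustrative assumptions, not something streamlit_chat provides:

import streamlit as st
from langchain.chat_models import ChatOpenAI
from langchain.callbacks.base import BaseCallbackHandler

class StreamToPlaceholder(BaseCallbackHandler):
    """Write each new token into a Streamlit placeholder as it arrives."""
    def __init__(self, placeholder):
        self.placeholder = placeholder
        self.text = ""

    def on_llm_new_token(self, token: str, **kwargs) -> None:
        self.text += token
        self.placeholder.markdown(self.text)

placeholder = st.empty()
llm = ChatOpenAI(streaming=True, callbacks=[StreamToPlaceholder(placeholder)])

# Hypothetical prompt source: the user's latest message from session state
reply = llm.predict(st.session_state['past'][-1])
st.session_state['generated'].append(reply)

The finished reply can then be appended to the history and rendered with message() on the next rerun, so only the in-progress answer lives outside the chat bubbles.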