Display spinning/loading animation while an LLM response message is still streaming #199
Labels: area: frontend, enhancement, good first issue, help wanted
Problem
Currently, the UI gives no way to tell whether a model has finished streaming its response or is still generating and simply taking a long time. This is especially noticeable on slower machines without AVX-512 or CUDA support, and even more of a problem when the model's response is very long.
Proposed fix
To improve the user experience, a small spinning loading wheel (or some other kind of animation) should be added beneath or next to the latest message in the chat view, animating in a way that makes it obvious the model is still generating and streaming a response. Once streaming has completed, the animation would stop and hide itself.
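The visibility logic described above can be sketched as a tiny state check in Rust. This is a minimal illustration, not the project's actual API; the StreamState name and its variants are assumptions.

```rust
// Hypothetical sketch of the spinner's visibility rule.
// `StreamState` and its variants are illustrative names, not the real API.
#[derive(Debug, PartialEq)]
enum StreamState {
    Streaming, // tokens are still arriving from the model
    Done,      // the stream has ended
}

/// The spinner is visible exactly while the latest message is still streaming.
fn spinner_visible(state: &StreamState) -> bool {
    matches!(state, StreamState::Streaming)
}

fn main() {
    // While tokens arrive, show the spinner.
    assert!(spinner_visible(&StreamState::Streaming));
    // Once the stream ends, hide it (or swap it for a checkmark, see below).
    assert!(!spinner_visible(&StreamState::Done));
}
```

The same boolean could drive the model-loading indicator mentioned below, since both cases are "background work in progress".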
Additional improvements
Alternatively, the spinning animation could change into a small checkmark icon (or something similar) to indicate that the message has been fully streamed from the model.
In addition, this small loading animation could also be displayed while a model is being loaded in the background, which again can take quite a long time (20+ seconds) on slower machines. This would reassure the user that something is indeed still happening, rather than leaving them confused while the UI appears to do nothing for dozens of seconds.
Implementation starter ideas
The wasmedge frontend/backend already makes it clear when a message has finished streaming; you can observe this in the console log, where the last log statement indicates that the stream has ended. You may also observe other forms of stream ending, most of which are currently handled at the Rust level via the StopReason enum. Search for both StopReason and finish_reason to help you get started.