Support streaming responses from LLM #670
talrejanikhil started this conversation in Ideas
Replies: 2 comments · 1 reply

There are some LLMs, such as OpenAI's, where the response can be streamed so that the user does not have to wait for the full response to be available:
https://platform.openai.com/docs/api-reference/streaming
Is it possible to add this option to Vanna?
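For reference, a minimal sketch of what the linked streaming API looks like with the OpenAI Python (v1) client. The model name and prompt are illustrative only; this is not anything Vanna currently does:

```python
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

# stream=True makes the API return chunks as the model generates them,
# instead of one response at the end
stream = client.chat.completions.create(
    model="gpt-4o",  # illustrative model name
    messages=[{"role": "user", "content": "Write SQL for total sales by region."}],
    stream=True,
)

for chunk in stream:
    delta = chunk.choices[0].delta.content  # None on some chunks (e.g. the final one)
    if delta:
        print(delta, end="", flush=True)
```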
-
It's an interesting one. There is an impression that the AI is talking to you, which is why you get a streamed response; but a DB response can be instantaneous, so do you want your user to wait as it streams the top 3 sales, etc.?
-
Actually, streaming makes the user wait less. They will see the SQL appearing immediately (token by token) rather than waiting for the whole SQL to be generated by the LLM.
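As an illustration of how that token-by-token behavior could be exposed, here is a hypothetical `generate_sql_stream` helper (not part of Vanna's current API) that wraps the OpenAI stream in a Python generator, so a caller can render each SQL token as it arrives:

```python
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

def generate_sql_stream(question: str):
    """Hypothetical helper: yield SQL tokens as the LLM emits them."""
    stream = client.chat.completions.create(
        model="gpt-4o",  # illustrative model name
        messages=[
            {"role": "system", "content": "Answer with a single SQL query."},
            {"role": "user", "content": question},
        ],
        stream=True,
    )
    for chunk in stream:
        token = chunk.choices[0].delta.content
        if token:  # skip chunks that carry no text (role header, final chunk)
            yield token

# The caller can print tokens as they arrive instead of blocking on the full query
for token in generate_sql_stream("What are the top 3 products by sales?"):
    print(token, end="", flush=True)
```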