
Implement LLM response streaming in /ask #863

Open
dlqqq opened this issue Jun 27, 2024 · 0 comments
Labels
enhancement

Comments

dlqqq commented Jun 27, 2024

Problem

#859 introduced LLM response streaming, but only for the default chat handler (i.e. the one used when no slash command is specified). Ideally, /ask should also stream responses whenever the underlying model provider supports it.

Proposed Solution

Implement LLM response streaming for /ask.
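
As a rough sketch of the direction, assuming the handler's chain is a LangChain runnable (which exposes an astream() method): the snippet below uses a bare ChatOpenAI chain and print() as stand-ins for the real /ask retrieval chain and chat UI plumbing, neither of which is specified in this issue.

```python
import asyncio

from langchain_core.output_parsers import StrOutputParser
from langchain_core.prompts import ChatPromptTemplate
from langchain_openai import ChatOpenAI


async def main() -> None:
    # Simplified stand-in: the real /ask handler wraps a retrieval chain
    # over the user's indexed files, which this sketch omits.
    prompt = ChatPromptTemplate.from_template("{question}")
    chain = prompt | ChatOpenAI() | StrOutputParser()

    # astream() yields partial output chunks as the model generates them,
    # so each chunk can be pushed to the chat UI immediately instead of
    # waiting for the complete response.
    async for chunk in chain.astream({"question": "Why stream responses?"}):
        print(chunk, end="", flush=True)


asyncio.run(main())
```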

Additional context

See #859

dlqqq added the enhancement label Jun 27, 2024
dlqqq added this to the v2.19.0 milestone Jun 27, 2024
dlqqq removed this from the v2.19.0 milestone Aug 5, 2024