Streaming text responses #27

Open
sixlive opened this issue Oct 16, 2024 · 8 comments
Assignees
Labels
planned Item planned on the roadmap

Comments

@sixlive
Contributor

sixlive commented Oct 16, 2024

Description

Enable streaming of generated text, which will be particularly useful for real-time applications and improved user experiences.

@sixlive sixlive added the enhancement New feature or request label Oct 16, 2024
@sixlive sixlive self-assigned this Oct 16, 2024
@sixlive sixlive moved this to Todo in Prism Development Oct 16, 2024
@sixlive sixlive added the planned Item planned on the roadmap label Oct 16, 2024
@petervandijck

I'd like to second this: it's a must-have before I can even start using the package 👍

@sixlive sixlive changed the title Streaming Text Response Streaming text responses Oct 25, 2024
@tognee

tognee commented Nov 18, 2024

Hi, is there a timeframe for this feature?
I could help develop it if there is a clear definition of how it should be implemented.

The first thing that needs to be discussed is whether streaming should be an extension of the Prism::text static function or a new Prism::streamingText function.

After that, the implementation seems simple: there should be a generate function that calls the API endpoint with the streaming parameter enabled and returns a generator. This generator should collect the data from the response, parse the SSE events coming from the provider, map each chunk to a new ChunkResponse object, and yield the objects one at a time.

Should this ChunkResponse follow the OpenAI Chunk type, or should it be something custom?

If we can agree on a specification together, I could start working on this feature.
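A minimal sketch of the parsing step described above (Python for illustration only; Prism itself is a PHP package, and the `ChunkResponse` fields and OpenAI-style event shape here are assumptions, not a final API):

```python
import json
from dataclasses import dataclass
from typing import Iterable, Iterator, Optional


@dataclass
class ChunkResponse:
    # Hypothetical provider-agnostic chunk; field names are illustrative.
    text: str
    finish_reason: Optional[str] = None


def parse_sse_stream(lines: Iterable[str]) -> Iterator[ChunkResponse]:
    """Parse the `data:` lines of an SSE stream into ChunkResponse objects."""
    for line in lines:
        line = line.strip()
        if not line.startswith("data:"):
            continue  # skip event names, comments, and blank keep-alive lines
        payload = line[len("data:"):].strip()
        if payload == "[DONE]":  # OpenAI-style end-of-stream sentinel
            break
        event = json.loads(payload)
        choice = event["choices"][0]
        yield ChunkResponse(
            text=choice.get("delta", {}).get("content") or "",
            finish_reason=choice.get("finish_reason"),
        )
```

Because the parser is a generator, each chunk is yielded as soon as its SSE line arrives rather than after the whole response completes.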

@tomtev

tomtev commented Nov 23, 2024

Really need this also :)

@petervandijck

petervandijck commented Nov 23, 2024 via email

@heychazza

Hey @sixlive, thanks for creating this package! Do you have any idea when streaming will be a thing?

@sixlive sixlive moved this from Todo to In Progress in Prism Development Jan 19, 2025
@sixlive sixlive removed the enhancement New feature or request label Jan 20, 2025
@mafrasil

+1 @sixlive - thanks for the package, it's awesome! Also eagerly looking forward to having streaming available.

@vesper8

vesper8 commented Feb 2, 2025

I have achieved this functionality using a Python library and websockets to stream to my frontend. That was already ~2 years ago.

If you're struggling to implement this fully, then as a first phase you could maybe just expose the text stream and let us handle it as we wish; if we want to broadcast to a websocket channel, we can do that ourselves.
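A sketch of that "expose the stream and let us handle it" idea (Python for illustration only; the `broadcast` callback is a hypothetical stand-in for whatever sink the consumer chooses, e.g. a websocket channel):

```python
from typing import Callable, Iterable


def forward_stream(chunks: Iterable[str], broadcast: Callable[[str], None]) -> str:
    """Forward each text chunk to a caller-supplied sink as it arrives,
    then return the assembled full response."""
    parts = []
    for chunk in chunks:
        broadcast(chunk)  # e.g. push to a websocket channel
        parts.append(chunk)
    return "".join(parts)
```

The library would only need to yield raw text chunks; anything downstream (websockets, server-sent events to the browser, logging) stays in the consumer's hands.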

@tognee

tognee commented Feb 2, 2025

@vesper8 I don't think this library is trying to implement backend-to-frontend communication. It needs to receive the API stream directly from the providers (which is usually an SSE (Server-Sent Events) stream) and map it into standardized, structured response chunks that can be used with any model and provider.
Frontend communication is the next step for developers using the library, not for the devs of this one.

Labels
planned Item planned on the roadmap
Projects
Status: In Progress
Development

No branches or pull requests

7 participants