
feat: support server sent event #333

Merged: 13 commits into mosecorg:main on Jun 15, 2023
Conversation

Signed-off-by: Keming <kemingy94@gmail.com>
@github-actions bot added the enhancement (New feature or request) label Apr 22, 2023
Signed-off-by: Keming <kemingy94@gmail.com>
Signed-off-by: Keming <kemingy94@gmail.com>
@kemingy marked this pull request as ready for review June 1, 2023 04:02
@kemingy requested a review from lkevinzc June 1, 2023 04:02
@github-actions bot added and removed the enhancement (New feature or request) label Jun 1, 2023
Signed-off-by: Keming <kemingy94@gmail.com>
Signed-off-by: Keming <kemingy94@gmail.com>
Signed-off-by: Keming <kemingy94@gmail.com>
Signed-off-by: Keming <kemingy94@gmail.com>
@kemingy (Member Author) commented Jun 12, 2023

@lkevinzc It's ready for review.

Signed-off-by: Keming <kemingy94@gmail.com>
Signed-off-by: Keming <kemingy94@gmail.com>
Signed-off-by: Keming <kemingy94@gmail.com>
Signed-off-by: Keming <kemingy94@gmail.com>
@@ -263,6 +284,8 @@ def coordinate(self):
        while not self.shutdown.is_set():
            try:
                _, ids, payloads = protocol_recv()
@lkevinzc (Member) commented Jun 14, 2023

Since both the normal server and the SSE server fetch requests from this protocol_recv, chances are that the SSE server will get data from an /inference request when a client queries the wrong endpoint. In that case, the server will process the data but the client will get nothing back.

Similarly, if a client queries a normal server at /sse_inference, odd behaviour will result.

We'd better handle this by returning an error for now; in the future we should not expose both endpoints by default, but instead use a multi-route kind of feature to expose a specific endpoint only when the server implements it.
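A toy sketch of the "reject instead of drop" behaviour suggested here. None of these names come from mosec; RouteNotImplemented, dispatch, and the route table are hypothetical stand-ins used only to illustrate returning an explicit error when a request hits a route the worker does not serve:

```python
from typing import Callable, Dict


class RouteNotImplemented(Exception):
    """Raised when a request targets a route the worker does not serve."""


def dispatch(
    routes: Dict[str, Callable[[bytes], bytes]], path: str, payload: bytes
) -> bytes:
    handler = routes.get(path)
    if handler is None:
        # Fail loudly instead of silently consuming the payload and
        # returning nothing to the client.
        raise RouteNotImplemented(f"{path} is not served by this worker")
    return handler(payload)


# A worker that only implements the streaming route:
routes: Dict[str, Callable[[bytes], bytes]] = {"/sse_inference": lambda b: b.upper()}

print(dispatch(routes, "/sse_inference", b"hello"))  # b'HELLO'
try:
    dispatch(routes, "/inference", b"hello")
except RouteNotImplemented as err:
    print("rejected:", err)
```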

@kemingy (Member Author) replied:

  1. impl SSE but called /inference => cannot find the stream event channel (info log), and nothing is returned to the client

  2. impl inference but called /sse_inference => the request is processed, but nothing is returned

Shall we return an error when the stream event channel cannot be found? This might not be 100% accurate.

I'm not sure how to handle case 2 under the current design.

@lkevinzc (Member) commented Jun 15, 2023

Is it possible to do a check on the server side to know whether it is a normal server or an SSE server, and then send that info to the Rust side? That way the Rust side will know whether it should reject requests to a route the worker does not implement. Another benefit is that these rejected requests will never reach the Python side for processing.

A simple possible implementation is to let users set a class attribute on the SSE worker; only when it is set can send_stream_event be called normally (see the sketch below).
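A minimal sketch of the class-attribute idea, assuming a hypothetical SSE_ENABLED flag and a simplified send_stream_event; neither is the merged mosec API:

```python
class Worker:
    """Toy base worker; a coordinator could read SSE_ENABLED and report it to the Rust side."""

    # Hypothetical opt-in flag, not the merged API.
    SSE_ENABLED = False

    def send_stream_event(self, text: str) -> None:
        if not type(self).SSE_ENABLED:
            # Calling this on a non-SSE worker is an error rather than a silent no-op.
            raise RuntimeError("this worker did not opt in to SSE streaming")
        print(f"event: {text}")  # placeholder for pushing to the real event channel


class StreamingWorker(Worker):
    SSE_ENABLED = True

    def forward(self, data: dict) -> str:
        for token in data["text"].split():
            self.send_stream_event(token)
        return data["text"]


StreamingWorker().forward({"text": "hello sse world"})  # emits three events
```

With such a flag, mismatched requests could be rejected before they ever reach the Python side, which is the other benefit mentioned above.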

@kemingy (Member Author) replied:

This might require implementing an SSE worker class. But passing the worker type might require a more complex config file than the current command line arguments.
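A hypothetical alternative, sketched here only to illustrate the trade-off: if the capability is expressed as a marker base class, the server could detect it with issubclass when the worker is registered, so no extra config file or command line flag would be needed. All names below are illustrative, not the project's API:

```python
class Worker:
    """Toy request/response worker."""

    def forward(self, data):
        raise NotImplementedError


class SSEWorker(Worker):
    """Marker base class for workers that stream server-sent events."""

    def send_stream_event(self, text: str) -> None:
        print(f"event: {text}")  # placeholder for the real event channel


class MyStreamingWorker(SSEWorker):
    def forward(self, data):
        for token in str(data).split():
            self.send_stream_event(token)


def exposes_sse(worker_cls: type) -> bool:
    # The server could run this check when a worker is appended and
    # report the result to the Rust side, instead of reading a config file.
    return issubclass(worker_cls, SSEWorker)


print(exposes_sse(MyStreamingWorker))  # True
print(exposes_sse(Worker))             # False
```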

@lkevinzc self-requested a review June 15, 2023 05:05

@lkevinzc (Member) left a comment

As discussed, we won't release until some refactoring is done and the issue mentioned in the comments above is resolved, but we'll merge this first.

@lkevinzc added this pull request to the merge queue Jun 15, 2023
Merged via the queue into mosecorg:main with commit 1654798 Jun 15, 2023
Labels: enhancement (New feature or request)
Projects: none yet
Linked issue this PR may close: [FEATURE] Streaming output
2 participants