
In kai-rpc-server add progress tracking and cancellation support for the rpc requests #525

Open
sjd78 opened this issue Dec 11, 2024 · 2 comments
Labels
enhancement New feature or request ide-plugin IDE Related Issues priority/important-longterm Important over the long term, but may not be staffed and/or may need multiple releases to complete. priority/nextup Issues we want to address soon rpc-server
Milestone
v0.1.0

Comments

@sjd78
Member

sjd78 commented Dec 11, 2024

In order to provide a better user experience with long-running requests, please add at least cancellation support. Progress tracking is also highly desirable.

Issue konveyor/editor-extensions#149 is blocked until at least cancellation support is available.

In general, LSP servers can use Work Done Progress reporting or the more generic request cancellation support defined in the LSP spec.
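For reference, here is a rough sketch of what those two mechanisms look like on the wire in LSP; whether kai-rpc-server adopts these exact `$/progress` and `$/cancelRequest` methods (rather than its own equivalents) is an open question:

```python
# Illustrative LSP message shapes only; whether kai-rpc-server adopts these
# exact methods is an open question.

# Work Done Progress: the server streams "$/progress" notifications tied to a
# progress token that was attached to the original request.
progress_report = {
    "jsonrpc": "2.0",
    "method": "$/progress",
    "params": {
        "token": "analysis-42",  # token supplied with the original request
        "value": {"kind": "report", "message": "analyzing pom.xml", "percentage": 40},
    },
}

# Cancellation: the client sends a "$/cancelRequest" notification carrying the
# id of the request it wants aborted.
cancel_request = {
    "jsonrpc": "2.0",
    "method": "$/cancelRequest",
    "params": {"id": 42},  # id of the in-flight request
}
```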

In either of these approaches, the rpc request would include a generated id. The server can send notification "event" messages back with progress notes. While the long-running request from the vscode extension is in progress, vscode could send a separate cancel request carrying the original generated id. The kai-rpc-server could then abort the work gracefully, potentially returning "partial" results.
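As a rough sketch of what cancellation-by-id could look like on the server side (in Python, since kai is a Python project) — the handler names, the dispatch mechanism, and the `threading.Event` bookkeeping below are all hypothetical, not the actual kai-rpc-server code:

```python
# Hypothetical sketch only -- these handler names and the dispatch mechanism
# are illustrative, not the actual kai-rpc-server code.
import threading

# One cancellation flag per in-flight request id.
_cancel_flags: dict[str, threading.Event] = {}


def handle_analyze(request_id: str, files: list[str]) -> dict:
    """Long-running handler that checks for cancellation between units of work."""
    flag = _cancel_flags.setdefault(request_id, threading.Event())
    results: list[dict] = []
    try:
        for path in files:
            if flag.is_set():
                # Abort gracefully and hand back whatever finished so far.
                return {"status": "cancelled", "partial_results": results}
            results.append(_analyze_one_file(path))
            # A progress notification (e.g. "$/progress") could be emitted here.
        return {"status": "done", "results": results}
    finally:
        _cancel_flags.pop(request_id, None)


def handle_cancel(request_id: str) -> None:
    """Cancel notification handler: flag the matching in-flight request."""
    flag = _cancel_flags.get(request_id)
    if flag is not None:
        flag.set()


def _analyze_one_file(path: str) -> dict:
    # Stand-in for the real per-file analysis work.
    return {"file": path, "incidents": []}
```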

As far as I know, the only way to actually abort an analysis or solution request is to kill the kai server process. Since killing the solution process could leave a dirty file tree behind, that's not a great option.

@shawn-hurley shawn-hurley added enhancement New feature or request priority/important-longterm Important over the long term, but may not be staffed and/or may need multiple releases to complete. ide-plugin IDE Related Issues rpc-server priority/nextup Issues we want to address soon labels Jan 10, 2025
@dymurray dymurray added this to the v0.1.0 milestone Jan 21, 2025
@rromannissen
Copy link

+1 on this. I think surfacing information on where the time is being spent with each fix is important for the user to understand what is going on, especially if we are dealing with constrained LLMs that might provide slower responses.

@shawn-hurley
Contributor

We need to think about this differently, rather than just as text progress messages.

I want to consider exposing the task manager state to the user so they can see what is being worked on and why.

For instance, we should show the seeded tasks, any tasks that would be worked on (within the max priority), and which task is currently being worked on.

If the settings allow it and there are child issues that are going to be worked, we should add those as they come up and also show when they are being worked.

When something is being worked on, we should show the agent that is working on it (dependency, analyzer, maven, etc.). While waiting for the LLM, we should show an icon indicating that is what is happening.
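For illustration only, here is a hypothetical shape for a task-state notification that could drive such a view; the `kai/taskState` method name and every field are made up for this sketch, not an agreed kai-rpc-server schema:

```python
# Hypothetical payload -- the "kai/taskState" method name and all fields are
# illustrative only, not an agreed kai-rpc-server schema.
task_state_notification = {
    "jsonrpc": "2.0",
    "method": "kai/taskState",
    "params": {
        "tasks": [
            {
                "id": "task-1",
                "origin": "seeded",      # seeded task vs. discovered child issue
                "priority": 1,
                "state": "in_progress",  # queued | in_progress | done
                "agent": "analyzer",     # dependency | analyzer | maven | ...
                "waiting_on_llm": True,  # would drive the "waiting for LLM" icon
            },
            {
                "id": "task-2",
                "origin": "child",
                "priority": 2,
                "state": "queued",
                "agent": None,
                "waiting_on_llm": False,
            },
        ],
    },
}
```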

I want to get some mock-ups and consider making this the resolution details page, not a chat interface.
