feat(pipeline): implement pause and resume functionality for prompt exec control #499
Conversation
…ecution
- Updated `pyproject.toml` and `requirements.txt` to use the latest version of `pytrickle`.
- Enhanced `VideoStreamTrack` to support stopping and resuming prompts.
- Added new endpoints for pausing and resuming prompts in the server.
- Implemented pause and resume methods in `ComfyStreamClient` and `Pipeline` classes.
- Improved error handling and logging for prompt control actions.
server/app.py (outdated)
    logger.error(f"Unexpected error in frame collection: {str(e)}")
finally:
    await self.pipeline.cleanup()
    await self.pipeline.stop_prompts(cleanup=True)
Any particular reason not to default to True in the stop_prompts method?
The `stop_prompts(cleanup=True)` option shuts down ComfyUI entirely, which has an impact on startup time. For this interface pattern (`pause_prompts`, `resume_prompts`, `stop_prompts`), I think it is best for the default not to clean up: `stop_prompts` is only for stopping (and clearing) the running workflow prompts.
There is a similar method, `pipeline.cleanup`, that overlaps with the `stop_prompts(cleanup=True)` option; I can at least tidy this up to avoid duplicate methods.
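The interface described above can be sketched as follows. This is a hypothetical illustration, not the actual comfystream implementation: `stop_prompts` defaults to `cleanup=False` so that stopping a workflow does not tear down ComfyUI (which is expensive to restart).

```python
import asyncio

# Hypothetical sketch of the discussed interface; names mirror the PR
# but the bodies are stand-ins.
class Pipeline:
    def __init__(self):
        self.running = False
        self.cleaned = False
        self.prompts = []

    async def stop_prompts(self, cleanup: bool = False):
        # Stop and clear the running workflow prompts only.
        self.running = False
        self.prompts.clear()
        if cleanup:
            # Full teardown; impacts startup time on the next run.
            await self.cleanup()

    async def cleanup(self):
        self.cleaned = True

p = Pipeline()
p.prompts = ["workflow"]
p.running = True
asyncio.run(p.stop_prompts())  # default: cheap stop, no teardown
print(p.prompts, p.running, p.cleaned)  # → [] False False
```

Callers that really do want a full shutdown (e.g. on connection close) opt in explicitly with `stop_prompts(cleanup=True)`.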
server/app.py (outdated)
await pipeline.set_prompts(prompts)
logger.info("[Offer] Set workflow prompts")
await pipeline.resume_prompts()
logger.info("[Offer] Set workflow prompts and resumed execution")
nit: I'm not sure I like this pattern of explicitly adding context such as [Offer] to the message; I would prefer the logger to take that as a parameter, e.g.:
function = "offer"
logger.info("[{}] Set workflow prompts and resumed execution".format(function))
I think this would go along with a larger refactor of the loggers; we should track it as a separate enhancement.
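For reference, the suggested refactor could be sketched with the standard library's `LoggerAdapter`: the call-site name ("offer", etc.) is attached to the logger once instead of being hard-coded into every message. The class and context names here are illustrative, not from the codebase.

```python
import logging

# Prepend call-site context to every message emitted via this adapter.
class ContextAdapter(logging.LoggerAdapter):
    def process(self, msg, kwargs):
        return "[%s] %s" % (self.extra["function"], msg), kwargs

logging.basicConfig(level=logging.INFO)
log = ContextAdapter(logging.getLogger("comfystream"), {"function": "offer"})
log.info("Set workflow prompts and resumed execution")
# emits: INFO:comfystream:[offer] Set workflow prompts and resumed execution
```

Each endpoint would then build its own adapter once, rather than repeating the `[Offer]` prefix in every log call.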
server/app.py (outdated)
        "success": True
    }
    channel.send(json.dumps(response))
elif params.get("type") == "stop_prompts":
This seems busy; maybe a case statement with function calls would be more readable?
Yeah, this block has become very long. I'll refactor
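One way the suggested refactor could look is a dispatch table of handler coroutines keyed by the message type, replacing the long if/elif chain on `params["type"]`. The handler bodies below are stand-ins, not the real implementations.

```python
import asyncio
import json

# Stand-in control handlers; the real ones act on the pipeline.
async def pause_prompts(params):
    return {"type": "pause_prompts", "success": True}

async def resume_prompts(params):
    return {"type": "resume_prompts", "success": True}

async def stop_prompts(params):
    return {"type": "stop_prompts", "success": True}

# Dispatch table: message type -> handler coroutine.
HANDLERS = {
    "pause_prompts": pause_prompts,
    "resume_prompts": resume_prompts,
    "stop_prompts": stop_prompts,
}

async def handle_control_message(raw: str) -> dict:
    params = json.loads(raw)
    handler = HANDLERS.get(params.get("type"))
    if handler is None:
        return {"success": False, "error": "unknown control type"}
    return await handler(params)

result = asyncio.run(handle_control_message('{"type": "stop_prompts"}'))
print(result)  # → {'type': 'stop_prompts', 'success': True}
```

Adding a new control action then only requires a new entry in `HANDLERS`, keeping the data-channel callback short.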
server/byoc.py (outdated)
# Add pause endpoint
async def pause_handler(request):
    try:
        # Fire-and-forget: do not await pause
a little elaboration in these comments could be helpful
Removed these handlers; they are not actually useful for BYOC pipelines and would break the encoder. Pause/resume is only needed as an internal function of prompt execution within the pipeline: 8715fb6
logger.info("Stopping ComfyStream client prompt execution")
try:
    await self.pipeline.client.cleanup()
    await self.pipeline.stop_prompts(cleanup=True)
see previous
See response above #499 (comment)
…cated async functions for each control action (get_nodes, update_prompts, update_resolution, pause_prompts, resume_prompts, stop_prompts)
- Added success response handling for prompt updates and resolution changes
…age functions
- Added a dedicated `send_error_response` function to streamline error handling across various control actions.
- Implemented error responses for missing parameters and exceptions in `update_prompts`, `update_resolution`, `pause_prompts`, `resume_prompts`, and `stop_prompts` handlers.
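A centralized error-response helper along these lines might look like the sketch below. This is a hedged illustration of the pattern the commit describes, not the actual `send_error_response` implementation; the channel is modeled as anything with a `send(str)` method.

```python
import json

# Build and send a structured JSON error for a failed control action.
def send_error_response(channel, action: str, error: str) -> None:
    channel.send(json.dumps({
        "type": action,
        "success": False,
        "error": error,
    }))

# Minimal stand-in for a data channel, for demonstration.
class FakeChannel:
    def __init__(self):
        self.sent = []

    def send(self, data: str) -> None:
        self.sent.append(data)

channel = FakeChannel()
send_error_response(channel, "update_prompts", "missing 'prompts' parameter")
print(channel.sent[0])
```

Every handler can then report missing parameters or exceptions in the same shape, so clients parse one consistent error format.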
- Updated `AudioStreamTrack` to stop prompts with cleanup option.
- Improved prompt cancellation logic in `cancel_collect_frames`.
- Added immediate stopping of prompts in `ComfyStreamFrameProcessor` to prevent overlap during updates.
- Refactored `ComfyStreamClient` to manage prompt execution state more effectively, including new methods for immediate stopping and fallback to passthrough workflow.
- Enhanced `Pipeline` class with immediate stop functionality for prompt execution.
- Improved error handling and logging across prompt management functions.
…agement
- Updated `pyproject.toml` and `requirements.txt` to use the latest version of `pytrickle`.
- Refactored prompt management in `ComfyStreamFrameProcessor` and `Pipeline` classes to improve loading overlay functionality.
- Introduced new methods for applying prompts with loading messages and warmup handling.
- Enhanced error handling during prompt application and streaming state transitions.
- Added support for managing pipeline states with `PipelineState` and `PipelineStateManager` to streamline lifecycle management.
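Lifecycle state management of this kind is often expressed as an enum plus a transition table. The following is an illustrative sketch in the spirit of the `PipelineState` / `PipelineStateManager` mentioned above; the actual states and transitions in comfystream may differ.

```python
from enum import Enum

class PipelineState(Enum):
    LOADING = "LOADING"
    IDLE = "IDLE"
    STREAMING = "STREAMING"
    PAUSED = "PAUSED"

class PipelineStateManager:
    # Allowed transitions; anything else is rejected.
    TRANSITIONS = {
        PipelineState.LOADING: {PipelineState.IDLE},
        PipelineState.IDLE: {PipelineState.STREAMING},
        PipelineState.STREAMING: {PipelineState.PAUSED, PipelineState.IDLE},
        PipelineState.PAUSED: {PipelineState.STREAMING, PipelineState.IDLE},
    }

    def __init__(self):
        self.state = PipelineState.LOADING

    def transition(self, new_state: PipelineState) -> bool:
        if new_state in self.TRANSITIONS[self.state]:
            self.state = new_state
            return True
        return False

mgr = PipelineStateManager()
print(mgr.transition(PipelineState.IDLE), mgr.state.value)  # → True IDLE
```

Centralizing transitions like this makes invalid sequences (e.g. pausing before the pipeline has ever streamed) easy to reject in one place.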
… execution immediately
- Updated `pyproject.toml` to use the latest version of `pytrickle`.
- Modified `.devcontainer/devcontainer.json` to include additional port forwarding and bind mounts.
- Refactored loading overlay management in `ComfyStreamFrameProcessor` and `Pipeline` classes for improved user experience.
- Removed deprecated loading methods and streamlined loading state handling.
- Enhanced error handling and logging during prompt processing and streaming state transitions.
…l methods
- Removed the `_stop_event` from `ComfyStreamClient` as it was no longer needed.
- Updated `pause_prompts` and `resume_prompts` methods to eliminate unnecessary async calls.
- Adjusted `PipelineStateManager` to call `pause_prompts` directly without awaiting, streamlining state management.
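The change described above amounts to making pause/resume plain synchronous methods that flip a flag, so callers such as the state manager don't need to await them. A sketch under that assumption (names are illustrative, not comfystream's actual code):

```python
import threading

class ComfyStreamClient:
    def __init__(self):
        # Event flag checked by the prompt-execution loop.
        self._paused = threading.Event()

    def pause_prompts(self) -> None:
        self._paused.set()

    def resume_prompts(self) -> None:
        self._paused.clear()

    @property
    def paused(self) -> bool:
        return self._paused.is_set()

client = ComfyStreamClient()
client.pause_prompts()  # direct call, no await required
print(client.paused)    # → True
client.resume_prompts()
print(client.paused)    # → False
```

Since setting a flag involves no I/O, there is nothing to await, which is why the async wrappers could be dropped.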
… endpoint in byoc.py
- Updated `pyproject.toml` and `requirements.txt` to use the latest commit of `pytrickle`.
- Improved logging and handling of stream start parameters in `ComfyStreamFrameProcessor`.
- Refactored warmup overlay management to streamline user experience during streaming.
- Added support for processing prompt payloads and managing warmup states more effectively.
… configuration for better control.
- Refactored `ComfyStreamFrameProcessor` to simplify initialization and remove unused warmup overlay logic.
…rlays
- Introduced a constant for default timeout in `byoc.py` to improve maintainability.
- Updated `frame_processor.py` to use the new timeout constant for overlay processing.
…ual track classes.
- Added centralized cleanup handling in the `offer` function for connection failures and closures.
- Improved logging for pipeline cleanup operations to enhance error tracking.
This pull request introduces the Frame Overlay feature from pytrickle to assist with hot-reloading comfystream for prompt updates.
This change also refactors the ComfyStream pipeline and client interface to provide control over prompt execution. New control message handlers were added for pausing, resuming, and stopping prompts. Warmup logic has also been improved to utilize the `on_stream_start` handler from pytrickle with initial stream parameters.

BYOC overlay and processor configuration:
- An `OverlayConfig` with a progress bar overlay was added to the BYOC stream processor, providing visual feedback during loading and warmup.

ComfyStream Server API Changes (app.py):
- Added control message handlers for `pause_prompts`, `resume_prompts`, and `stop_prompts` on the data channel, allowing runtime control over prompt execution. These handlers send structured JSON responses for success and error cases, improving reliability and client feedback.
- Refactored `offer` and related endpoints to use `apply_prompts` and `start_streaming`, ensuring prompts are properly applied and pipeline execution is started in a unified way.
- Frame and audio collection now call `stop_prompts(cleanup=True)`, ensuring prompt execution is properly stopped and resources are cleaned up.

Pipeline initialization and warmup:
- Warmup is now handled via `on_stream_start`, simplifying the startup flow.

Logging and configuration improvements:
- Configured the `comfy.cmd.execution` and `comfystream` loggers to suppress verbose execution logs, and set the `comfy.model_detection` logger to warning level for less noise. This was necessary to suppress transient errors during workflow prompt switching (affected mostly ControlNet TensorRT workflows).

Testing completed:
In comfystream server (app.py):
In byoc.py:
- `/health` shows LOADING until the default prompt fully initializes ComfyUI, then shows IDLE.
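The `/health` behavior verified above can be modeled as a tiny state holder that flips from LOADING to IDLE once warmup completes. This is an illustrative model only; the class and method names are assumptions, not comfystream's actual code.

```python
import json

class HealthStatus:
    def __init__(self):
        self.warmed_up = False

    def mark_warmup_complete(self) -> None:
        # Called once the default prompt has fully initialized ComfyUI.
        self.warmed_up = True

    def response(self) -> str:
        # JSON body the /health endpoint would return.
        return json.dumps({"status": "IDLE" if self.warmed_up else "LOADING"})

health = HealthStatus()
print(health.response())  # → {"status": "LOADING"}
health.mark_warmup_complete()
print(health.response())  # → {"status": "IDLE"}
```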