Conversation
@eliteprox eliteprox commented Nov 10, 2025

This pull request introduces the Frame Overlay feature from pytrickle to support hot-reloading comfystream when prompts are updated.

This change also refactors the ComfyStream pipeline and client interface to provide control over prompt execution. New control message handlers were added for pausing, resuming, and stopping prompts.

Warmup logic has also been improved to utilize the on_stream_start handler from pytrickle with initial stream parameters.

BYOC overlay and processor configuration:

  • Integrated OverlayConfig with a progress bar overlay into the BYOC stream processor, providing visual feedback during loading and warmup.

ComfyStream Server API Changes (app.py)

  • Added new control message handlers for pause_prompts, resume_prompts, and stop_prompts on the data channel, allowing runtime control over prompt execution. These handlers send structured JSON responses for success and error cases, improving reliability and client feedback.
  • Refactored prompt application logic in offer and related endpoints to use apply_prompts and start_streaming, ensuring prompts are properly applied and pipeline execution is started in a unified way.
  • Changed pipeline cleanup calls to use stop_prompts(cleanup=True) for both frame and audio collection, ensuring prompt execution is properly stopped and resources are cleaned up.
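The structured JSON success/error responses described above could look like the following sketch for one handler. The `pause_prompts` handler name and the `success`/`error` fields come from the PR; the exact payload shape and the `pipeline`/`channel` objects are stand-ins.

```python
import json

async def handle_pause_prompts(pipeline, channel) -> None:
    """Pause prompt execution and report the outcome on the data channel."""
    try:
        await pipeline.pause_prompts()
        channel.send(json.dumps({"type": "pause_prompts", "success": True}))
    except Exception as exc:
        # Structured error case: clients can key off "success" and "error"
        # instead of parsing free-form log text.
        channel.send(json.dumps({"type": "pause_prompts",
                                 "success": False, "error": str(exc)}))
```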

Pipeline initialization and warmup:

  • Added explicit pipeline initialization during server startup to ensure pipeline resources are ready before handling requests.
  • Refactored BYOC startup to remove redundant warmup endpoint and clarified that warmup runs automatically as part of on_stream_start, simplifying the startup flow.
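The startup flow above can be sketched as follows. `on_stream_start` is the pytrickle handler named in the PR, and `apply_prompts`/`start_streaming` are mentioned in the API changes; the exact `Pipeline` interface used here is an assumption for illustration.

```python
class Processor:
    """Sketch: warmup runs inside on_stream_start, so no /warmup endpoint."""

    def __init__(self, pipeline):
        self.pipeline = pipeline
        self.warmed_up = False

    async def on_stream_start(self, params: dict) -> None:
        # Apply the initial stream parameters first...
        await self.pipeline.apply_prompts(params.get("prompts", []))
        if not self.warmed_up:
            # ...then warm up automatically, exactly once per process.
            await self.pipeline.warmup()
            self.warmed_up = True
        await self.pipeline.start_streaming()
```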

Logging and configuration improvements:

  • Added timeout filters to both the comfy.cmd.execution and comfystream loggers to suppress verbose execution logs, and set the comfy.model_detection logger to warning level for less noise. This was necessary to suppress transient errors during workflow prompt switching, which mostly affected ControlNet TensorRT workflows.
  • Improved logging messages to clarify server startup and workflow parameter expectations for BYOC.
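A minimal sketch of the logging changes above, assuming the "timeout filter" is a standard `logging.Filter` that drops timeout-related records (the real filter's matching logic may differ):

```python
import logging

class TimeoutFilter(logging.Filter):
    """Drop records that mention timeouts (assumed matching rule)."""
    def filter(self, record: logging.LogRecord) -> bool:
        # Returning False suppresses the record.
        return "timeout" not in record.getMessage().lower()

# Attach to the noisy loggers named in the PR...
logging.getLogger("comfy.cmd.execution").addFilter(TimeoutFilter())
logging.getLogger("comfystream").addFilter(TimeoutFilter())
# ...and quiet model detection down to warnings.
logging.getLogger("comfy.model_detection").setLevel(logging.WARNING)
```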

Testing completed:

  • In comfystream server (app.py):

    • Tested new streams with SD 1.5 and audio transcription workflows
    • Stopped and started new streams with no issues; warmup transitions are smoother than before
    • Prompt updates applied correctly
  • In byoc.py:

    • /health shows LOADING until the default prompt fully initializes ComfyUI, then shows IDLE
    • False starts/stops during the loading phase are rejected; they do not affect pipeline state or cause double loading
    • Cold startups succeed with any workflow (SD 1.5, audio transcription, etc.)
    • Prompt updates work for active streams with any workflow; even modality changes from video to audio processing update frame routing correctly
    • Frame overlay auto-loading/auto-hide is consistent

…ecution

- Updated `pyproject.toml` and `requirements.txt` to use the latest version of `pytrickle`.
- Enhanced `VideoStreamTrack` to support stopping and resuming prompts.
- Added new endpoints for pausing and resuming prompts in the server.
- Implemented pause and resume methods in `ComfyStreamClient` and `Pipeline` classes.
- Improved error handling and logging for prompt control actions.
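The pause/resume methods above can be sketched with an `asyncio.Event` gate that the execution loop waits on. This is an assumption about the internals; the real `ComfyStreamClient` may be structured differently.

```python
import asyncio

class ComfyStreamClient:
    """Sketch: pause/resume as a gate the prompt-execution loop parks at."""

    def __init__(self):
        self._running = asyncio.Event()
        self._running.set()                   # start unpaused

    def pause_prompts(self) -> None:
        self._running.clear()                 # execution loop blocks at the gate

    def resume_prompts(self) -> None:
        self._running.set()                   # release the gate

    async def _run_prompt_once(self, work) -> None:
        await self._running.wait()            # park here while paused
        await work()
```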
@eliteprox eliteprox changed the title feat(pipeline): implement pause and resume functionality for prompt ex… feat(pipeline): implement pause and resume functionality for prompt exec control Nov 10, 2025
server/app.py Outdated
logger.error(f"Unexpected error in frame collection: {str(e)}")
finally:
await self.pipeline.cleanup()
await self.pipeline.stop_prompts(cleanup=True)


Any particular reason not to default to True in the stop_prompts method?

Collaborator Author

@eliteprox eliteprox Nov 10, 2025


The await self.pipeline.stop_prompts(cleanup=True) option shuts down ComfyUI entirely, which impacts startup time. For this interface pattern (pause_prompts, resume_prompts, stop_prompts), the best default is to not clean up: stop_prompts is only for stopping (and clearing) the running workflow prompts.

There is a similar method, pipeline.cleanup, that overlaps with the stop_prompts(cleanup=True) option; I can at least tidy this up to avoid duplicate methods.
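The semantics described in this reply could be sketched as follows; the method bodies are placeholders, not the real implementation, and the attribute names are illustrative.

```python
class Pipeline:
    """Sketch: cleanup defaults to False so stop_prompts is cheap."""

    def __init__(self):
        self.running_prompts = []
        self.comfy_alive = True

    async def stop_prompts(self, cleanup: bool = False) -> None:
        self.running_prompts.clear()      # always stop/clear the workflow prompts
        if cleanup:
            await self.cleanup()          # opt-in: full ComfyUI shutdown

    async def cleanup(self) -> None:
        self.comfy_alive = False          # expensive: impacts next startup time
```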

server/app.py Outdated
await pipeline.set_prompts(prompts)
logger.info("[Offer] Set workflow prompts")
await pipeline.resume_prompts()
logger.info("[Offer] Set workflow prompts and resumed execution")


nit: not sure I like this pattern of explicitly adding context such as [Offer]. I would prefer to see the logger take that as a parameter, e.g.:
function = "offer"
logger.info("[{}] Set workflow prompts and resumed execution".format(function))
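One way to get this without hand-writing the prefix in every call is a `logging.LoggerAdapter` that carries the function name; the logger name and format here are illustrative, not from the PR.

```python
import logging

logger = logging.getLogger("comfystream.app")
logger.setLevel(logging.INFO)
handler = logging.StreamHandler()
handler.setFormatter(logging.Formatter("[%(function)s] %(message)s"))
logger.addHandler(handler)

def scoped(function: str) -> logging.LoggerAdapter:
    """Bind a function-name context so every record carries it."""
    return logging.LoggerAdapter(logger, {"function": function})

log = scoped("offer")
log.info("Set workflow prompts and resumed execution")
# logs: [offer] Set workflow prompts and resumed execution
```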

Collaborator Author


I think this would go along with a larger refactor of the loggers; we should track it as a separate enhancement.

server/app.py Outdated
"success": True
}
channel.send(json.dumps(response))
elif params.get("type") == "stop_prompts":


this seems busy - maybe a case statement with function calls would be more readable?
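The refactor suggested here could be sketched as a dispatch table of handler coroutines, which is Python's closest idiom to a case statement with function calls (a `match` statement would also work). Handler names follow the PR; the `pipeline`/`channel` objects are stand-ins.

```python
import json

async def pause_prompts(pipeline, params): await pipeline.pause_prompts()
async def resume_prompts(pipeline, params): await pipeline.resume_prompts()
async def stop_prompts(pipeline, params): await pipeline.stop_prompts()

# One entry per control message type, instead of a long if/elif chain.
CONTROL_HANDLERS = {
    "pause_prompts": pause_prompts,
    "resume_prompts": resume_prompts,
    "stop_prompts": stop_prompts,
}

async def dispatch(pipeline, channel, params: dict) -> None:
    handler = CONTROL_HANDLERS.get(params.get("type"))
    if handler is None:
        channel.send(json.dumps({"success": False, "error": "unknown type"}))
        return
    await handler(pipeline, params)
    channel.send(json.dumps({"success": True, "type": params["type"]}))
```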

Collaborator Author


Yeah, this block has become very long. I'll refactor.

Collaborator Author


Refactored in 754929f and c9abfee

server/byoc.py Outdated
# Add pause endpoint
async def pause_handler(request):
try:
# Fire-and-forget: do not await pause
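The fire-and-forget comment above can be elaborated with a sketch like this: schedule the pause as a task and return at once so the HTTP reply does not block on it. (These handlers were later removed; names here are illustrative.)

```python
import asyncio

# Keep strong references so scheduled tasks aren't garbage-collected mid-run.
_background_tasks: set[asyncio.Task] = set()

async def pause_handler(pipeline) -> dict:
    """Schedule the pause and return immediately (fire-and-forget)."""
    task = asyncio.create_task(pipeline.pause_prompts())
    _background_tasks.add(task)
    task.add_done_callback(_background_tasks.discard)
    return {"status": "pausing"}
```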


a little elaboration in these comments could be helpful

Collaborator Author

@eliteprox eliteprox Nov 11, 2025


Removed these handlers; they are not actually useful for BYOC pipelines and would break the encoder. Pause/resume is only needed as an internal function of prompt execution within the pipeline (8715fb6).

logger.info("Stopping ComfyStream client prompt execution")
try:
await self.pipeline.client.cleanup()
await self.pipeline.stop_prompts(cleanup=True)


see previous

Collaborator Author


See response above #499 (comment)

…cated async functions for each control action (get_nodes, update_prompts, update_resolution, pause_prompts, resume_prompts, stop_prompts)

- Added success response handling for prompt updates and resolution changes
…age functions

- Added a dedicated `send_error_response` function to streamline error handling across various control actions.
- Implemented error responses for missing parameters and exceptions in `update_prompts`, `update_resolution`, `pause_prompts`, `resume_prompts`, and `stop_prompts` handlers.
@eliteprox eliteprox requested a review from pschroedl November 11, 2025 19:35
- Updated `AudioStreamTrack` to stop prompts with cleanup option.
- Improved prompt cancellation logic in `cancel_collect_frames`.
- Added immediate stopping of prompts in `ComfyStreamFrameProcessor` to prevent overlap during updates.
- Refactored `ComfyStreamClient` to manage prompt execution state more effectively, including new methods for immediate stopping and fallback to passthrough workflow.
- Enhanced `Pipeline` class with immediate stop functionality for prompt execution.
- Improved error handling and logging across prompt management functions.
…agement

- Updated `pyproject.toml` and `requirements.txt` to use the latest version of `pytrickle`.
- Refactored prompt management in `ComfyStreamFrameProcessor` and `Pipeline` classes to improve loading overlay functionality.
- Introduced new methods for applying prompts with loading messages and warmup handling.
- Enhanced error handling during prompt application and streaming state transitions.
- Added support for managing pipeline states with `PipelineState` and `PipelineStateManager` to streamline lifecycle management.
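A hypothetical sketch of what the `PipelineState`/`PipelineStateManager` pair could look like; the actual types may differ. It also matches the tested behavior that false starts during the loading phase are rejected.

```python
from enum import Enum, auto

class PipelineState(Enum):
    LOADING = auto()
    IDLE = auto()
    STREAMING = auto()

class PipelineStateManager:
    """Sketch: centralize legal lifecycle transitions in one table."""

    _ALLOWED = {
        PipelineState.LOADING: {PipelineState.IDLE},
        PipelineState.IDLE: {PipelineState.STREAMING},
        PipelineState.STREAMING: {PipelineState.IDLE},
    }

    def __init__(self):
        self.state = PipelineState.LOADING

    def transition(self, new: "PipelineState") -> bool:
        """Apply a transition; reject illegal ones (e.g. start while LOADING)."""
        if new not in self._ALLOWED[self.state]:
            return False          # e.g. false starts during loading are rejected
        self.state = new
        return True
```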
- Updated `pyproject.toml` to use the latest version of `pytrickle`.
- Modified `.devcontainer/devcontainer.json` to include additional port forwarding and bind mounts.
- Refactored loading overlay management in `ComfyStreamFrameProcessor` and `Pipeline` classes for improved user experience.
- Removed deprecated loading methods and streamlined loading state handling.
- Enhanced error handling and logging during prompt processing and streaming state transitions.
…l methods

- Removed the `_stop_event` from `ComfyStreamClient` as it was no longer needed.
- Updated `pause_prompts` and `resume_prompts` methods to eliminate unnecessary async calls.
- Adjusted `PipelineStateManager` to call `pause_prompts` directly without awaiting, streamlining state management.
- Updated `pyproject.toml` and `requirements.txt` to use the latest commit of `pytrickle`.
- Improved logging and handling of stream start parameters in `ComfyStreamFrameProcessor`.
- Refactored warmup overlay management to streamline user experience during streaming.
- Added support for processing prompt payloads and managing warmup states more effectively.
… configuration for better control.

- Refactored `ComfyStreamFrameProcessor` to simplify initialization and remove unused warmup overlay logic.
…rlays

- Introduced a constant for default timeout in `byoc.py` to improve maintainability.
- Updated `frame_processor.py` to use the new timeout constant for overlay processing.
…ual track classes.

- Added centralized cleanup handling in the `offer` function for connection failures and closures.
- Improved logging for pipeline cleanup operations to enhance error tracking.
@eliteprox eliteprox merged commit 48fcca4 into main Nov 25, 2025
5 checks passed
@eliteprox eliteprox deleted the feat/prompt-exec-ctrl branch November 25, 2025 22:57
@eliteprox eliteprox linked an issue Dec 8, 2025 that may be closed by this pull request

Development

Successfully merging this pull request may close these issues.

  • ComfyStream Pipeline Pause/Resume/Stop
  • ComfyStream warmup loading screen
  • Develop interface for comfystream.pipeline to start/stop workers

4 participants