🔧 feat: improve the streaming #181

Merged
Mile-Away merged 3 commits into main from feature/better-streaming
Jan 19, 2026

Conversation

@xinquiry
Collaborator

Changes

  • New feature
  • Bug fix
  • Refactoring / enhancement
  • Other (please describe)

Briefly describe the main changes in this PR.

Related Issue

Link the related issue, if any: #issue-number

Checklist

Checked by default; please review any item that is not actually met.

  • Passed local testing
  • Relevant documentation added/updated
  • Test cases added
  • Code style checked by the pre-commit hooks

Additional Notes

Add any special notes or caveats here.

xinquiry and others added 2 commits January 19, 2026 16:27
Backend changes (see the sketch below):
- Create LLM before graph compilation for proper streaming interception
- Change react config from COMPONENT (subgraph) to direct LLM+TOOL nodes
- Make build_graph async in all components to enable pre-creation of LLM
- Skip final AIMessage only when buffer has content (fix deep research)
- Move node transition detection before AIMessage skip check
- Access messages directly from state to preserve BaseMessage types

Frontend changes:
- Fix duplicate detection in streaming_chunk handler (condition was inverted)

Co-Authored-By: Claude <noreply@anthropic.com>
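
A minimal sketch of the backend change above, assuming LangGraph's StateGraph/MessagesState and langchain_openai's ChatOpenAI; the build_graph signature, model name, and node names are illustrative, not the repository's actual code:

```python
# Sketch only: the LLM is created (with streaming enabled) before the graph
# is compiled, so its streaming output can be intercepted later.
from langchain_openai import ChatOpenAI
from langgraph.graph import END, START, MessagesState, StateGraph


async def build_graph(model_name: str = "gpt-4o-mini"):
    # Async so an awaitable LLM factory could be used here, as the PR
    # describes; constructing ChatOpenAI itself needs no await.
    llm = ChatOpenAI(model=model_name, streaming=True)

    async def call_llm(state: MessagesState):
        # Read messages directly from the state so BaseMessage types are
        # preserved instead of being re-serialized.
        response = await llm.ainvoke(state["messages"])
        return {"messages": [response]}

    builder = StateGraph(MessagesState)
    builder.add_node("llm", call_llm)
    builder.add_edge(START, "llm")
    builder.add_edge("llm", END)
    return builder.compile()
```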
Copilot AI review requested due to automatic review settings January 19, 2026 11:01
Copy link
Contributor

@sourcery-ai sourcery-ai bot left a comment


Sorry @xinquiry, you have reached your weekly rate limit of 500000 diff characters.

Please try again later or upgrade to continue using Sourcery

@codecov

codecov bot commented Jan 19, 2026

Contributor

Copilot AI left a comment


Pull request overview

This PR improves streaming functionality by fixing duplicate content detection and enabling proper token-by-token streaming through LangGraph's messages mode.

Changes:

  • Fixed duplicate content detection logic in the frontend to correctly identify when backend sends full content after incremental streaming
  • Refactored graph builder and components to create LLM instances before graph compilation, enabling proper streaming interception
  • Added celery-types dependency for better type safety with Celery tasks
  • Restructured ReAct agent from component-based to direct LLM+TOOL nodes for improved streaming support (see the sketch after this list)
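
As a rough illustration of the direct-node restructure (not the repository's code), a ReAct loop can be wired with an LLM node and LangGraph's prebuilt ToolNode/tools_condition; build_react_graph and the node names are assumptions:

```python
from langgraph.graph import START, MessagesState, StateGraph
from langgraph.prebuilt import ToolNode, tools_condition


async def build_react_graph(llm, tools):
    llm_with_tools = llm.bind_tools(tools)

    async def call_llm(state: MessagesState):
        return {"messages": [await llm_with_tools.ainvoke(state["messages"])]}

    builder = StateGraph(MessagesState)
    builder.add_node("llm", call_llm)            # direct LLM node
    builder.add_node("tools", ToolNode(tools))   # direct TOOL node
    builder.add_edge(START, "llm")
    # tools_condition routes to "tools" when the last message carries tool
    # calls, otherwise to END.
    builder.add_conditional_edges("llm", tools_condition)
    builder.add_edge("tools", "llm")
    return builder.compile()
```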

Reviewed changes

Copilot reviewed 10 out of 12 changed files in this pull request and generated no comments.

Summary per file:

web/src/store/slices/chatSlice.ts: Improved duplicate content detection logic and added debug logging for streaming events
service/app/core/chat/langchain.py: Refactored message streaming to clear the buffer on node transitions and skip the final AIMessage when streaming is active
service/app/api/ws/v1/chat.py: Removed type ignore comment now that celery-types provides proper type hints
service/app/agents/graph_builder.py: Made build() and node builders async to create the LLM before compilation; components now return compiled graphs directly
service/app/agents/factory.py: Added streaming: True flag to LLM model kwargs to enable token-by-token output
service/app/agents/components/react.py: Made build_graph async and moved LLM creation before graph compilation
service/app/agents/components/executable.py: Updated build_graph signature to async, with documentation explaining the requirement
service/app/agents/components/deep_research/components.py: Updated all component build_graph methods to async
service/app/agents/builtin/react.py: Restructured from component-based to direct LLM+TOOL nodes for better streaming
service/pyproject.toml: Added celery-types>=0.24.0 dependency
service/uv.lock: Added celery-types package lock entries
.gitignore: Added .claude/ directory to the ignore list

@Mile-Away Mile-Away merged commit 3282291 into main Jan 19, 2026
9 of 10 checks passed
@Mile-Away Mile-Away deleted the feature/better-streaming branch January 19, 2026 14:34