feat: show document artifact after generating report #658
Conversation
🦋 Changeset detected. Latest commit: b1e59f6. The changes in this PR will be included in the next version bump. This PR includes changesets to release 4 packages.
Walkthrough

This change updates both the Python and TypeScript deep research workflows to emit a document artifact event immediately after generating a report, ensuring the artifact is displayed without delay. It also updates related dependencies, imports additional styles, and modifies the ESLint configuration for the server package.
Sequence Diagram(s)

```mermaid
sequenceDiagram
    participant User
    participant Workflow as Workflow (Python/TS)
    participant Stream as Artifact Event Stream
    User->>Workflow: Request report generation
    Workflow->>Workflow: Generate report (streaming or non-streaming)
    Workflow->>User: Stream report content (incremental events)
    Workflow->>Stream: Emit document artifact event (full report)
    Stream->>User: Display document artifact immediately
```
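The emission step in the flow above can be sketched in Python. This is a minimal, self-contained illustration: the `DocumentArtifactData`, `Artifact`, and `ArtifactEvent` classes below are simplified local stand-ins (not the actual `llama_index.server` models), and `emit_report_artifact` is a hypothetical helper showing only the shape of the step.

```python
import time
from dataclasses import dataclass

# Simplified stand-ins for the server's artifact models; the real classes
# live in llama_index.server.models and carry more fields.
@dataclass
class DocumentArtifactData:
    title: str
    content: str
    type: str  # e.g. "markdown"

@dataclass
class Artifact:
    created_at: int
    type: str  # "document"
    data: DocumentArtifactData

@dataclass
class ArtifactEvent:
    data: Artifact

def emit_report_artifact(report_text: str, send_event) -> str:
    """After streaming the report text, emit a single artifact event
    carrying the full report so the UI can render it immediately."""
    event = ArtifactEvent(
        data=Artifact(
            created_at=int(time.time()),
            type="document",
            data=DocumentArtifactData(
                title="Research Report",
                content=report_text,
                type="markdown",
            ),
        )
    )
    send_event(event)  # push to the artifact event stream
    return ""          # workflow result stays empty; content travels in the event

emitted = []
result = emit_report_artifact("# Findings\n...", emitted.append)
```

The key design point mirrored here is that the full report travels in the event payload, while the workflow's own return value becomes an empty string.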
Actionable comments posted: 1
🧹 Nitpick comments (2)
packages/server/project-config/eslint.config.mjs (1)
21-21: Question: is disabling the empty-object-type rule needed globally?

Turning off `@typescript-eslint/no-empty-object-type` for the entire project may hide unintended uses of empty object types. Consider scoping this override to only the files or patterns that actually require it, or using inline `// eslint-disable-next-line` comments where necessary to minimize the global impact.

packages/create-llama/templates/components/use-cases/python/deep_research/workflow.py (1)
381-382: Move the import to the top of the file for consistency.

The `AsyncGenerator` import should be moved to the top-level imports section for better code organization and consistency with Python conventions.

```diff
+from collections.abc import AsyncGenerator
 import logging
 import os
 import uuid
 from typing import List, Literal, Optional
 ...
-from collections.abc import AsyncGenerator
```
📜 Review details
Configuration used: .coderabbit.yaml
Review profile: CHILL
Plan: Pro
⛔ Files ignored due to path filters (1)
- `pnpm-lock.yaml` is excluded by `!**/pnpm-lock.yaml`
📒 Files selected for processing (7)
- .changeset/tame-wolves-obey.md (1 hunks)
- packages/create-llama/templates/components/use-cases/python/deep_research/workflow.py (2 hunks)
- packages/create-llama/templates/components/use-cases/typescript/deep_research/src/app/workflow.ts (2 hunks)
- packages/server/next/app/layout.tsx (1 hunks)
- packages/server/package.json (1 hunks)
- packages/server/project-config/eslint.config.mjs (1 hunks)
- packages/server/project-config/package.json (1 hunks)
🧰 Additional context used
🧬 Code Graph Analysis (1)
packages/create-llama/templates/components/use-cases/typescript/deep_research/src/app/workflow.ts (1)
packages/server/src/utils/events.ts (1)
- `artifactEvent` (121-124)
⏰ Context from checks skipped due to timeout of 90000ms (48)
🔇 Additional comments (9)
packages/server/project-config/package.json (1)
44-44: LGTM: Dependency update supports the new artifact features.

The version bump from 0.5.2 to 0.5.5 for `@llamaindex/chat-ui` aligns with the feature addition and is consistent across packages.

packages/server/package.json (1)
68-68: LGTM: Consistent dependency update.

The version update maintains consistency with the project-config package and supports the new artifact event functionality.
packages/server/next/app/layout.tsx (1)
4-4: LGTM: Added necessary styles for artifact rendering.

The `editor.css` import is appropriately placed and necessary to support the document artifact display functionality.
.changeset/tame-wolves-obey.md (1)
1-7: LGTM: Well-documented changeset.

The changeset properly documents the feature addition with appropriate patch-level versioning and a clear description.
packages/create-llama/templates/components/use-cases/typescript/deep_research/src/app/workflow.ts (2)
1-1: LGTM: Proper import addition for artifact functionality.

The `artifactEvent` import is correctly added to support the new document artifact feature.
342-357: LGTM: Well-implemented artifact event emission.

The implementation correctly:
- Sends the artifact event after streaming completion but before the final stop event
- Captures the complete response content
- Uses proper data structure with appropriate metadata (title, content, type)
- Provides immediate visibility of the generated document as intended
The timing and data structure align perfectly with the PR objective to show document artifacts after report generation.
packages/create-llama/templates/components/use-cases/python/deep_research/workflow.py (3)
26-36: LGTM! Import additions support the new artifact functionality.

The new imports are correctly added to support the artifact emission feature. The imports include all necessary classes for creating and emitting document artifacts after report generation.
400-412: LGTM! Artifact emission correctly implements the PR objective.

The artifact creation and emission logic correctly implements the feature to show document artifacts after generating reports. The structure includes all necessary metadata and the content is properly set to the generated report text.
415-415: ✅ Verification successful

Verify the impact of returning an empty string instead of the report content.
The change from returning the response object to returning an empty string could be a breaking change if downstream code expects the actual report content in the StopEvent result.
🏁 Script executed:

```shell
#!/bin/bash
# Description: Search for usage of StopEvent result from DeepResearchWorkflow to verify impact
# Expected: Find code that depends on the StopEvent result containing the report content

# Search for workflow usage and StopEvent handling
rg -A 10 -B 5 "DeepResearchWorkflow.*run|StopEvent.*result" --type py

# Search for any code that processes the workflow result
rg -A 5 "\.run\(" --type py | rg -A 5 -B 5 "result"
```

Length of output: 2466
No downstream dependencies on DeepResearchWorkflow’s StopEvent result detected
I searched the entire codebase and found no references to `DeepResearchWorkflow` or handling of its `StopEvent.result`. Since this is a template file (used only when scaffolding new projects) and nothing in our repo consumes that return value, returning an empty string here won't break any existing code.
Actionable comments posted: 1
🧹 Nitpick comments (1)
python/llama-index-server/llama_index/server/models/artifacts.py (1)
3-3: Remove unused import `Dict`.

`Dict` is imported but not used anywhere in this module. Apply this diff:

```diff
-from typing import Dict, List, Literal, Optional, Union
+from typing import List, Literal, Optional, Union
```

🧰 Tools
🪛 Ruff (0.11.9)

3-3: `typing.Dict` imported but unused

Remove unused import: `typing.Dict` (F401)
📜 Review details
Configuration used: .coderabbit.yaml
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (3)
- packages/create-llama/templates/components/use-cases/python/deep_research/workflow.py (3 hunks)
- python/llama-index-server/llama_index/server/models/__init__.py (2 hunks)
- python/llama-index-server/llama_index/server/models/artifacts.py (2 hunks)
✅ Files skipped from review due to trivial changes (1)
- python/llama-index-server/llama_index/server/models/__init__.py
🚧 Files skipped from review as they are similar to previous changes (1)
- packages/create-llama/templates/components/use-cases/python/deep_research/workflow.py
🧰 Additional context used
🧬 Code Graph Analysis (1)
python/llama-index-server/llama_index/server/models/artifacts.py (1)
packages/server/src/utils/events.ts (1)
- `DocumentArtifactData` (107-112)
🪛 Ruff (0.11.9)
python/llama-index-server/llama_index/server/models/artifacts.py
3-3: typing.Dict imported but unused
Remove unused import: typing.Dict
(F401)
🪛 Pylint (3.3.7)
python/llama-index-server/llama_index/server/models/artifacts.py
[refactor] 24-24: Too few public methods (0/2)
(R0903)
[refactor] 29-29: Too few public methods (0/2)
(R0903)
⏰ Context from checks skipped due to timeout of 90000ms (57)
🔇 Additional comments (1)
python/llama-index-server/llama_index/server/models/artifacts.py (1)
33-33: Ensure new model is exported.
Verify that `DocumentArtifactSource` is added to `python/llama-index-server/llama_index/server/models/__init__.py` so it's publicly accessible.
Actionable comments posted: 0
♻️ Duplicate comments (1)
python/llama-index-server/llama_index/server/utils/stream.py (1)
28-40: Use more specific type checking for better type safety.

The `isinstance(res, AsyncGenerator)` check was flagged in past reviews as not robust enough. It should check against the specific LlamaIndex type for better type safety.

```diff
-    if isinstance(res, AsyncGenerator):
+    if isinstance(res, CompletionResponseAsyncGen):
```

This ensures we're specifically handling LlamaIndex's streaming response type rather than any generic async generator.
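To illustrate why the generic check is too loose, the sketch below uses a hypothetical `FakeCompletionStream` stand-in (the real type would be LlamaIndex's `CompletionResponseAsyncGen`): a library response object is often async-iterable without being a raw async generator, so `isinstance(res, AsyncGenerator)` can silently fail to match it.

```python
import asyncio
from collections.abc import AsyncGenerator

# Hypothetical stand-in for a library streaming-response type. It is
# async-iterable (implements __aiter__) but is NOT itself an async
# generator, so the generic AsyncGenerator ABC check does not match it.
class FakeCompletionStream:
    def __init__(self, chunks):
        self._chunks = chunks

    def __aiter__(self):
        return self._iterate()

    async def _iterate(self):
        for chunk in self._chunks:
            yield chunk

async def collect(res) -> str:
    # Dispatch on the concrete streaming type instead of the generic ABC.
    if isinstance(res, FakeCompletionStream):
        return "".join([chunk async for chunk in res])
    return str(res)

stream = FakeCompletionStream(["Hello, ", "world"])
generic_check = isinstance(stream, AsyncGenerator)  # False: wrapper, not a generator
text = asyncio.run(collect(stream))
```

Checking against the concrete type documents intent and keeps the branch from accidentally swallowing other async iterables.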
🧹 Nitpick comments (1)
packages/create-llama/templates/components/use-cases/python/deep_research/workflow.py (1)
4-4: Remove unused import.

The `AsyncGenerator` import is not used in this file since the streaming logic has been moved to the utility function.

```diff
-from typing import List, Literal, Optional, AsyncGenerator
+from typing import List, Literal, Optional
```

🧰 Tools
🪛 Ruff (0.11.9)
4-4: `typing.AsyncGenerator` imported but unused

Remove unused import: `typing.AsyncGenerator` (F401)
📜 Review details
Configuration used: .coderabbit.yaml
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (2)
- packages/create-llama/templates/components/use-cases/python/deep_research/workflow.py (3 hunks)
- python/llama-index-server/llama_index/server/utils/stream.py (1 hunks)
🧰 Additional context used
🧬 Code Graph Analysis (1)
packages/create-llama/templates/components/use-cases/python/deep_research/workflow.py (4)
python/llama-index-server/llama_index/server/models/artifacts.py (5)
- `ArtifactEvent` (58-66)
- `ArtifactType` (13-15)
- `Artifact` (36-55)
- `DocumentArtifactData` (29-33)
- `DocumentArtifactSource` (24-26)

python/llama-index-server/llama_index/server/models/chat.py (1)
- `ChatRequest` (29-44)

python/llama-index-server/llama_index/server/models/source_nodes.py (1)
- `SourceNodesEvent` (10-22)

python/llama-index-server/llama_index/server/utils/stream.py (1)
- `handle_llm_response_stream` (10-45)
🪛 Ruff (0.11.9)
packages/create-llama/templates/components/use-cases/python/deep_research/workflow.py
4-4: typing.AsyncGenerator imported but unused
Remove unused import: typing.AsyncGenerator
(F401)
⏰ Context from checks skipped due to timeout of 90000ms (57)
🔇 Additional comments (5)
packages/create-llama/templates/components/use-cases/python/deep_research/workflow.py (4)
26-37: Well-organized imports for artifact functionality.

The new imports are properly structured and include all necessary classes for creating and emitting document artifacts.
380-380: Good use of the utility function for unified response handling.

Using `handle_llm_response_stream` provides a clean abstraction for handling both streaming and non-streaming responses, improving code maintainability.
382-400: Perfect implementation of artifact event creation.

This implementation exactly matches the pattern suggested by Marcus in the past review comments, creating a document artifact with proper metadata and source references.
403-403: Appropriate return value change.

Returning an empty string is consistent with the new approach where the actual content is delivered via the artifact event rather than the workflow result.
python/llama-index-server/llama_index/server/utils/stream.py (1)
10-45: Well-designed utility function with proper documentation.

The function provides a clean abstraction for handling both streaming and non-streaming responses, with good documentation and error handling through the `chunk.delta or ""` pattern.
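A minimal sketch of that unified handling, assuming simplified stand-in response types (`Chunk` with an optional `delta`, `CompleteResponse` with the full `text`) rather than the actual LlamaIndex classes:

```python
import asyncio
from dataclasses import dataclass
from typing import AsyncIterator, Optional

# Hypothetical stand-ins mirroring the shapes the review describes:
# streaming chunks carry an optional delta, complete responses carry text.
@dataclass
class Chunk:
    delta: Optional[str]

@dataclass
class CompleteResponse:
    text: str

async def write_response_to_stream(res, emit) -> str:
    """Handle streaming and non-streaming responses uniformly:
    forward each piece as it arrives and return the accumulated text."""
    if isinstance(res, CompleteResponse):
        emit(res.text)
        return res.text
    final = ""
    async for chunk in res:
        piece = chunk.delta or ""  # guard against None deltas
        emit(piece)
        final += piece
    return final

async def fake_stream() -> AsyncIterator[Chunk]:
    for piece in ["The ", None, "report."]:
        yield Chunk(delta=piece)

emitted = []
streamed = asyncio.run(write_response_to_stream(fake_stream(), emitted.append))
direct = asyncio.run(write_response_to_stream(CompleteResponse("Done."), emitted.append))
```

The caller gets the full accumulated text back either way, which is what lets the workflow build the artifact event from a single return value.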
Actionable comments posted: 0
🧹 Nitpick comments (1)
packages/create-llama/templates/components/use-cases/python/deep_research/workflow.py (1)
4-4: Remove unused import.

The `AsyncGenerator` import is not directly used in this file since the type checking is handled internally by the `write_response_to_stream` utility function.

```diff
-from typing import List, Literal, Optional, AsyncGenerator
+from typing import List, Literal, Optional
```

🧰 Tools
🪛 Ruff (0.11.9)
4-4: `typing.AsyncGenerator` imported but unused

Remove unused import: `typing.AsyncGenerator` (F401)
📜 Review details
Configuration used: .coderabbit.yaml
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (2)
- packages/create-llama/templates/components/use-cases/python/deep_research/workflow.py (3 hunks)
- python/llama-index-server/llama_index/server/utils/stream.py (1 hunks)
🚧 Files skipped from review as they are similar to previous changes (1)
- python/llama-index-server/llama_index/server/utils/stream.py
🧰 Additional context used
🧬 Code Graph Analysis (1)
packages/create-llama/templates/components/use-cases/python/deep_research/workflow.py (3)
python/llama-index-server/llama_index/server/models/artifacts.py (5)
- `ArtifactEvent` (58-66)
- `ArtifactType` (13-15)
- `Artifact` (36-55)
- `DocumentArtifactData` (29-33)
- `DocumentArtifactSource` (24-26)

python/llama-index-server/llama_index/server/models/source_nodes.py (1)
- `SourceNodesEvent` (10-22)

python/llama-index-server/llama_index/server/utils/stream.py (1)
- `write_response_to_stream` (10-45)
🪛 Ruff (0.11.9)
packages/create-llama/templates/components/use-cases/python/deep_research/workflow.py
4-4: typing.AsyncGenerator imported but unused
Remove unused import: typing.AsyncGenerator
(F401)
⏰ Context from checks skipped due to timeout of 90000ms (57)
- GitHub Check: python (20, 3.11, windows-latest, fastapi, --no-files, streaming)
- GitHub Check: python (20, 3.11, macos-latest, fastapi, --llamacloud, llamaindexserver)
- GitHub Check: typescript (20, 3.11, macos-latest, nextjs, --llamacloud, streaming)
- GitHub Check: python (20, 3.11, macos-latest, fastapi, --llamacloud, streaming)
- GitHub Check: typescript (20, 3.11, macos-latest, nextjs, --example-file, llamaindexserver)
- GitHub Check: python (20, 3.11, macos-latest, fastapi, --example-file, llamaindexserver)
- GitHub Check: typescript (20, 3.11, macos-latest, nextjs, --example-file, streaming)
- GitHub Check: python (20, 3.11, macos-latest, fastapi, --example-file, streaming)
- GitHub Check: typescript (20, 3.11, macos-latest, nextjs, --no-files, llamaindexserver)
- GitHub Check: python (20, 3.11, macos-latest, fastapi, --no-files, llamaindexserver)
- GitHub Check: Unit Tests (windows-latest, 3.9)
- GitHub Check: typescript (20, 3.11, macos-latest, nextjs, --no-files, streaming)
- GitHub Check: Unit Tests (ubuntu-latest, 3.9)
- GitHub Check: lint
- GitHub Check: python (20, 3.11, macos-latest, fastapi, --no-files, streaming)
🔇 Additional comments (3)
packages/create-llama/templates/components/use-cases/python/deep_research/workflow.py (3)
26-37: Well-structured imports for artifact functionality. The imports are correctly organized and include all the components needed for creating and emitting document artifacts. The addition of the `write_response_to_stream` utility import aligns with the pattern of abstracting streaming complexity.
380-401: Excellent implementation of artifact event emission. This implementation follows the pattern suggested in previous reviews by @marcusschiesser. The code correctly:

- Uses the `write_response_to_stream` utility to handle both streaming and non-streaming responses
- Creates an `ArtifactEvent` with the proper document artifact structure
- Includes all required metadata (timestamp, title, content, type, sources)
- Maps context nodes to document sources appropriately

Abstracting the response-handling complexity lets the workflow focus on the business logic of creating and emitting the artifact.
403-403: Appropriate return value change. Returning an empty string makes sense, since the actual response content is now delivered through the stream and the artifact event, maintaining a clean separation of concerns.
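The pattern discussed in the comments above can be sketched as follows. This is a minimal, self-contained illustration, not the actual template code: the `DocumentArtifact`, `ArtifactEvent`, and `write_response_to_stream` definitions here are simplified stand-ins for the real classes in `llama_index.server`, whose exact signatures may differ.

```python
import time
from dataclasses import dataclass, field

# Hypothetical stand-ins for the server's event types; the real classes
# live in llama_index.server and may have a different shape.
@dataclass
class DocumentArtifact:
    title: str
    content: str
    type: str = "markdown"
    sources: list = field(default_factory=list)
    created_at: float = field(default_factory=time.time)

@dataclass
class ArtifactEvent:
    data: DocumentArtifact

async def write_response_to_stream(response, ctx):
    """Simplified stand-in: drain streamed chunks into the final text.

    The real utility also forwards each chunk to the UI stream as it arrives.
    """
    final = ""
    async for chunk in response:
        final += chunk
    return final

async def report_step(ctx, response, context_nodes):
    # 1. Drain the (possibly streaming) LLM response into the final report text.
    report = await write_response_to_stream(response, ctx)
    # 2. Emit the document artifact immediately so the UI shows it without delay.
    ctx.write_event_to_stream(
        ArtifactEvent(
            data=DocumentArtifact(
                title="Deep Research Report",
                content=report,
                sources=[n.node_id for n in context_nodes],
            )
        )
    )
    # 3. Return an empty string: the content is carried by the artifact event.
    return ""
```

The key ordering point is that the artifact event is written to the stream right after the response finishes streaming, rather than after the workflow completes.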
Summary by CodeRabbit
New Features
Style
Chores