
Conversation


@thucpn thucpn commented May 28, 2025

Summary by CodeRabbit

  • New Features

    • The generated document artifact is now displayed immediately after creating a report, providing instant access to the report content.
    • Reports include detailed source citations to enhance traceability and context.
  • Style

    • Improved editor appearance by updating and adding related CSS styles.
  • Chores

    • Upgraded UI dependencies for enhanced stability and features.
    • Updated linting configuration to streamline code quality checks.


changeset-bot bot commented May 28, 2025

🦋 Changeset detected

Latest commit: b1e59f6

The changes in this PR will be included in the next version bump.

This PR includes changesets to release 4 packages:

  • create-llama: Patch
  • @llamaindex/server: Patch
  • @create-llama/llama-index-server: Patch
  • llamaindex-server-examples: Patch
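
For context, a changeset is a small markdown file committed with the PR that tells the release tooling which packages to bump and how. The one in this PR (.changeset/tame-wolves-obey.md, reviewed below) looks roughly like this; the description line is illustrative, not the file's exact text:

---
"create-llama": patch
"@llamaindex/server": patch
"@create-llama/llama-index-server": patch
"llamaindex-server-examples": patch
---

Show the generated document artifact immediately after creating a report.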



coderabbitai bot commented May 28, 2025

Warning

Rate limit exceeded

@thucpn has exceeded the limit for the number of commits or files that can be reviewed per hour. Please wait 3 minutes and 2 seconds before requesting another review.

⌛ How to resolve this issue?

After the wait time has elapsed, a review can be triggered using the @coderabbitai review command as a PR comment. Alternatively, push new commits to this PR.

We recommend that you space out your commits to avoid hitting the rate limit.

🚦 How do rate limits work?

CodeRabbit enforces hourly rate limits for each developer per organization.

Our paid plans have higher rate limits than the trial, open-source and free plans. In all cases, we re-allow further reviews after a brief timeout.

Please see our FAQ for further information.

📥 Commits

Reviewing files that changed from the base of the PR and between 2b4dbba and b1e59f6.

📒 Files selected for processing (1)
  • packages/server/src/utils/events.ts (1 hunks)

"""

Walkthrough

This change updates both Python and TypeScript deep research workflows to emit a document artifact event immediately after generating a report, ensuring the artifact is displayed without delay. It also updates related dependencies, imports additional styles, and modifies ESLint configuration for the server package.
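
As a rough illustration of that pattern, the Python-side emission has this shape (a minimal sketch: the model and field names below are assumptions drawn from the review discussion, and the template's actual code may differ):

import time

from llama_index.server.models import (
    Artifact,
    ArtifactEvent,
    ArtifactType,
    DocumentArtifactData,
    DocumentArtifactSource,
)

def emit_report_artifact(ctx, report_text: str, source_ids: list[str]) -> None:
    # Wrap the finished report in a document artifact event so the UI
    # can render it as soon as streaming completes.
    ctx.write_event_to_stream(
        ArtifactEvent(
            data=Artifact(
                created_at=int(time.time()),
                type=ArtifactType.DOCUMENT,
                data=DocumentArtifactData(
                    title="DeepResearch Report",
                    content=report_text,  # full markdown report text
                    type="markdown",
                    sources=[DocumentArtifactSource(id=sid) for sid in source_ids],
                ),
            ),
        )
    )

The TypeScript workflow emits an equivalent artifactEvent after its streaming loop.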

Changes

  • packages/create-llama/templates/components/use-cases/python/deep_research/workflow.py: Enhanced report step to emit a document artifact event after report generation, handling both streaming and non-streaming results.
  • packages/create-llama/templates/components/use-cases/typescript/deep_research/src/app/workflow.ts: Added emission of a document artifact event after report streaming, including metadata and markdown content.
  • python/llama-index-server/llama_index/server/models/artifacts.py and python/llama-index-server/llama_index/server/models/__init__.py: Added DocumentArtifactSource model and included it in exports; extended DocumentArtifactData with optional sources field.
  • python/llama-index-server/llama_index/server/utils/stream.py: Added write_response_to_stream utility to uniformly process streaming and non-streaming LLM responses.
  • packages/server/src/utils/events.ts: Modified DocumentArtifactData type's sources property to include only id fields for citation rendering.
  • packages/server/next/app/layout.tsx: Imported additional editor CSS for improved styling.
  • packages/server/package.json and packages/server/project-config/package.json: Updated @llamaindex/chat-ui dependency version from 0.5.2 to 0.5.6.
  • packages/server/project-config/eslint.config.mjs: Disabled the "@typescript-eslint/no-empty-object-type" ESLint rule.

Sequence Diagram(s)

sequenceDiagram
    participant User
    participant Workflow as Workflow (Python/TS)
    participant Stream as Artifact Event Stream
    User->>Workflow: Request report generation
    Workflow->>Workflow: Generate report (streaming or non-streaming)
    Workflow->>User: Stream report content (incremental events)
    Workflow->>Stream: Emit document artifact event (full report)
    Stream->>User: Display document artifact immediately


Suggested reviewers

  • marcusschiesser
  • leehuwuj

Poem

A rabbit hops and writes with cheer,
Now artifacts appear quite near!
Reports are shown without delay,
Markdown magic on display.
With every hop, the workflow’s bright—
Artifacts in canvas, what a sight!
🐇✨
"""

@thucpn thucpn marked this pull request as ready for review June 5, 2025 10:19

@coderabbitai coderabbitai bot left a comment


Actionable comments posted: 1

🧹 Nitpick comments (2)
packages/server/project-config/eslint.config.mjs (1)

21-21: Question: Is the empty-object-type rule disablement needed globally?

Turning off @typescript-eslint/no-empty-object-type for the entire project may hide unintended uses of empty object types. Consider scoping this override to only the files or patterns that actually require it, or using inline // eslint-disable-next-line comments where necessary to minimize global impact.

packages/create-llama/templates/components/use-cases/python/deep_research/workflow.py (1)

381-382: Move the import to the top of the file for consistency.

The AsyncGenerator import should be moved to the top-level imports section for better code organization and consistency with Python conventions.

+from collections.abc import AsyncGenerator
 import logging
 import os
 import uuid
 from typing import List, Literal, Optional

...

-        from collections.abc import AsyncGenerator
📜 Review details

Configuration used: .coderabbit.yaml
Review profile: CHILL
Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between a543a27 and 38ac299.

⛔ Files ignored due to path filters (1)
  • pnpm-lock.yaml is excluded by !**/pnpm-lock.yaml
📒 Files selected for processing (7)
  • .changeset/tame-wolves-obey.md (1 hunks)
  • packages/create-llama/templates/components/use-cases/python/deep_research/workflow.py (2 hunks)
  • packages/create-llama/templates/components/use-cases/typescript/deep_research/src/app/workflow.ts (2 hunks)
  • packages/server/next/app/layout.tsx (1 hunks)
  • packages/server/package.json (1 hunks)
  • packages/server/project-config/eslint.config.mjs (1 hunks)
  • packages/server/project-config/package.json (1 hunks)
🧰 Additional context used
🧬 Code Graph Analysis (1)
packages/create-llama/templates/components/use-cases/typescript/deep_research/src/app/workflow.ts (1)
packages/server/src/utils/events.ts (1)
  • artifactEvent (121-124)
⏰ Context from checks skipped due to timeout of 90000ms (48)
🔇 Additional comments (9)
packages/server/project-config/package.json (1)

44-44: LGTM: Dependency update supports new artifact features.

The version bump from 0.5.2 to 0.5.5 for @llamaindex/chat-ui aligns with the feature addition and is consistent across packages.

packages/server/package.json (1)

68-68: LGTM: Consistent dependency update.

The version update maintains consistency with the project-config package and supports the new artifact event functionality.

packages/server/next/app/layout.tsx (1)

4-4: LGTM: Added necessary styles for artifact rendering.

The editor.css import is appropriately placed and necessary to support the document artifact display functionality.

.changeset/tame-wolves-obey.md (1)

1-7: LGTM: Well-documented changeset.

The changeset properly documents the feature addition with appropriate patch-level versioning and clear description.

packages/create-llama/templates/components/use-cases/typescript/deep_research/src/app/workflow.ts (2)

1-1: LGTM: Proper import addition for artifact functionality.

The artifactEvent import is correctly added to support the new document artifact feature.


342-357: LGTM: Well-implemented artifact event emission.

The implementation correctly:

  • Sends the artifact event after streaming completion but before the final stop event
  • Captures the complete response content
  • Uses proper data structure with appropriate metadata (title, content, type)
  • Provides immediate visibility of the generated document as intended

The timing and data structure align perfectly with the PR objective to show document artifacts after report generation.

packages/create-llama/templates/components/use-cases/python/deep_research/workflow.py (3)

26-36: LGTM! Import additions support the new artifact functionality.

The new imports are correctly added to support the artifact emission feature. The imports include all necessary classes for creating and emitting document artifacts after report generation.


400-412: LGTM! Artifact emission correctly implements the PR objective.

The artifact creation and emission logic correctly implements the feature to show document artifacts after generating reports. The structure includes all necessary metadata and the content is properly set to the generated report text.


415-415:

✅ Verification successful

Verify the impact of returning empty string instead of the report content.

The change from returning the response object to returning an empty string could be a breaking change if downstream code expects the actual report content in the StopEvent result.


🏁 Script executed:

#!/bin/bash
# Description: Search for usage of StopEvent result from DeepResearchWorkflow to verify impact
# Expected: Find code that depends on the StopEvent result containing the report content

# Search for workflow usage and StopEvent handling
rg -A 10 -B 5 "DeepResearchWorkflow.*run\|StopEvent.*result" --type py

# Search for any code that processes the workflow result
rg -A 5 "\.run\(" --type py | rg -A 5 -B 5 "result"

Length of output: 2466


No downstream dependencies on DeepResearchWorkflow’s StopEvent result detected

I searched the entire codebase and found no references to DeepResearchWorkflow or handling of its StopEvent.result. Since this is a template file (used only when scaffolding new projects) and nothing in our repo consumes that return value, returning an empty string here won’t break any existing code.


@coderabbitai coderabbitai bot left a comment


Actionable comments posted: 1

🧹 Nitpick comments (1)
python/llama-index-server/llama_index/server/models/artifacts.py (1)

3-3: Remove unused import Dict.
Dict is imported but not used anywhere in this module.

Apply this diff:

-from typing import Dict, List, Literal, Optional, Union
+from typing import List, Literal, Optional, Union
🧰 Tools
🪛 Ruff (0.11.9)

3-3: typing.Dict imported but unused

Remove unused import: typing.Dict

(F401)

📜 Review details

Configuration used: .coderabbit.yaml
Review profile: CHILL
Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between b97dc3e and f521f12.

📒 Files selected for processing (3)
  • packages/create-llama/templates/components/use-cases/python/deep_research/workflow.py (3 hunks)
  • python/llama-index-server/llama_index/server/models/__init__.py (2 hunks)
  • python/llama-index-server/llama_index/server/models/artifacts.py (2 hunks)
✅ Files skipped from review due to trivial changes (1)
  • python/llama-index-server/llama_index/server/models/__init__.py
🚧 Files skipped from review as they are similar to previous changes (1)
  • packages/create-llama/templates/components/use-cases/python/deep_research/workflow.py
🧰 Additional context used
🧬 Code Graph Analysis (1)
python/llama-index-server/llama_index/server/models/artifacts.py (1)
packages/server/src/utils/events.ts (1)
  • DocumentArtifactData (107-112)
🪛 Ruff (0.11.9)
python/llama-index-server/llama_index/server/models/artifacts.py

3-3: typing.Dict imported but unused

Remove unused import: typing.Dict

(F401)

🪛 Pylint (3.3.7)
python/llama-index-server/llama_index/server/models/artifacts.py

[refactor] 24-24: Too few public methods (0/2)

(R0903)


[refactor] 29-29: Too few public methods (0/2)

(R0903)

⏰ Context from checks skipped due to timeout of 90000ms (57)
🔇 Additional comments (1)
python/llama-index-server/llama_index/server/models/artifacts.py (1)

33-33: Ensure new model is exported.
Verify that DocumentArtifactSource is added to python/llama-index-server/llama_index/server/models/__init__.py so it’s publicly accessible.
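
A minimal sketch of the expected addition, assuming the models package re-exports its public classes directly:

from llama_index.server.models.artifacts import (
    DocumentArtifactData,
    DocumentArtifactSource,
)

__all__ = [
    "DocumentArtifactData",
    "DocumentArtifactSource",
]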


@coderabbitai coderabbitai bot left a comment


Actionable comments posted: 0

♻️ Duplicate comments (1)
python/llama-index-server/llama_index/server/utils/stream.py (1)

28-40: Use more specific type checking for better type safety.

The isinstance(res, AsyncGenerator) check was flagged in past reviews as not robust enough. It should check against the specific LlamaIndex type for better type safety.

-    if isinstance(res, AsyncGenerator):
+    if isinstance(res, CompletionResponseAsyncGen):

This ensures we're specifically handling LlamaIndex's streaming response type rather than any generic async generator.

🧹 Nitpick comments (1)
packages/create-llama/templates/components/use-cases/python/deep_research/workflow.py (1)

4-4: Remove unused import.

The AsyncGenerator import is not used in this file since the streaming logic has been moved to the utility function.

-from typing import List, Literal, Optional, AsyncGenerator
+from typing import List, Literal, Optional
🧰 Tools
🪛 Ruff (0.11.9)

4-4: typing.AsyncGenerator imported but unused

Remove unused import: typing.AsyncGenerator

(F401)

📜 Review details

Configuration used: .coderabbit.yaml
Review profile: CHILL
Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between b682e55 and c244605.

📒 Files selected for processing (2)
  • packages/create-llama/templates/components/use-cases/python/deep_research/workflow.py (3 hunks)
  • python/llama-index-server/llama_index/server/utils/stream.py (1 hunks)
🧰 Additional context used
🧬 Code Graph Analysis (1)
packages/create-llama/templates/components/use-cases/python/deep_research/workflow.py (4)
python/llama-index-server/llama_index/server/models/artifacts.py (5)
  • ArtifactEvent (58-66)
  • ArtifactType (13-15)
  • Artifact (36-55)
  • DocumentArtifactData (29-33)
  • DocumentArtifactSource (24-26)
python/llama-index-server/llama_index/server/models/chat.py (1)
  • ChatRequest (29-44)
python/llama-index-server/llama_index/server/models/source_nodes.py (1)
  • SourceNodesEvent (10-22)
python/llama-index-server/llama_index/server/utils/stream.py (1)
  • handle_llm_response_stream (10-45)
🪛 Ruff (0.11.9)
packages/create-llama/templates/components/use-cases/python/deep_research/workflow.py

4-4: typing.AsyncGenerator imported but unused

Remove unused import: typing.AsyncGenerator

(F401)

⏰ Context from checks skipped due to timeout of 90000ms (57)
🔇 Additional comments (5)
packages/create-llama/templates/components/use-cases/python/deep_research/workflow.py (4)

26-37: Well-organized imports for artifact functionality.

The new imports are properly structured and include all necessary classes for creating and emitting document artifacts.


380-380: Good use of the utility function for unified response handling.

Using handle_llm_response_stream provides a clean abstraction for handling both streaming and non-streaming responses, improving code maintainability.


382-400: Perfect implementation of artifact event creation.

This implementation exactly matches the pattern suggested by Marcus in the past review comments, creating a document artifact with proper metadata and source references.


403-403: Appropriate return value change.

Returning an empty string is consistent with the new approach where the actual content is delivered via the artifact event rather than the workflow result.

python/llama-index-server/llama_index/server/utils/stream.py (1)

10-45: Well-designed utility function with proper documentation.

The function provides a clean abstraction for handling both streaming and non-streaming responses, with good documentation and error handling through the chunk.delta or "" pattern.
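
For readers without the diff at hand, a helper of this shape can be sketched as follows (simplified: the real utility writes UI events to the workflow context, so the emit callback here is an assumption standing in for that):

from collections.abc import AsyncGenerator
from typing import Callable, Union

from llama_index.core.llms import CompletionResponse

async def write_response_to_stream(
    res: Union[CompletionResponse, AsyncGenerator],
    emit: Callable[[str], None],
) -> str:
    # Streaming case: forward each delta as it arrives and accumulate
    # the full text to return once the generator is exhausted.
    if isinstance(res, AsyncGenerator):
        final_response = ""
        async for chunk in res:
            delta = chunk.delta or ""
            final_response += delta
            emit(delta)
        return final_response
    # Non-streaming case: emit the complete text in one shot.
    emit(res.text)
    return res.text

The workflow can then pass the returned string straight into the artifact event, which is why the report step no longer needs its own branching on response type.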


@coderabbitai coderabbitai bot left a comment


Actionable comments posted: 0

🧹 Nitpick comments (1)
packages/create-llama/templates/components/use-cases/python/deep_research/workflow.py (1)

4-4: Remove unused import.

The AsyncGenerator import is not directly used in this file since the type checking is handled internally by the write_response_to_stream utility function.

-from typing import List, Literal, Optional, AsyncGenerator
+from typing import List, Literal, Optional
🧰 Tools
🪛 Ruff (0.11.9)

4-4: typing.AsyncGenerator imported but unused

Remove unused import: typing.AsyncGenerator

(F401)

📜 Review details

Configuration used: .coderabbit.yaml
Review profile: CHILL
Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between c244605 and 9f062df.

📒 Files selected for processing (2)
  • packages/create-llama/templates/components/use-cases/python/deep_research/workflow.py (3 hunks)
  • python/llama-index-server/llama_index/server/utils/stream.py (1 hunks)
🚧 Files skipped from review as they are similar to previous changes (1)
  • python/llama-index-server/llama_index/server/utils/stream.py
🧰 Additional context used
🧬 Code Graph Analysis (1)
packages/create-llama/templates/components/use-cases/python/deep_research/workflow.py (3)
python/llama-index-server/llama_index/server/models/artifacts.py (5)
  • ArtifactEvent (58-66)
  • ArtifactType (13-15)
  • Artifact (36-55)
  • DocumentArtifactData (29-33)
  • DocumentArtifactSource (24-26)
python/llama-index-server/llama_index/server/models/source_nodes.py (1)
  • SourceNodesEvent (10-22)
python/llama-index-server/llama_index/server/utils/stream.py (1)
  • write_response_to_stream (10-45)
🪛 Ruff (0.11.9)
packages/create-llama/templates/components/use-cases/python/deep_research/workflow.py

4-4: typing.AsyncGenerator imported but unused

Remove unused import: typing.AsyncGenerator

(F401)

⏰ Context from checks skipped due to timeout of 90000ms (57)
🔇 Additional comments (3)
packages/create-llama/templates/components/use-cases/python/deep_research/workflow.py (3)

26-37: Well-structured imports for artifact functionality.

The imports are correctly organized and include all the necessary components for creating and emitting document artifacts. The addition of the write_response_to_stream utility function import aligns with the pattern of abstracting streaming complexity.


380-401: Excellent implementation of artifact event emission.

This implementation perfectly follows the pattern suggested in previous reviews by @marcusschiesser. The code correctly:

  • Uses the write_response_to_stream utility to handle both streaming and non-streaming responses
  • Creates an ArtifactEvent with proper document artifact structure
  • Includes all required metadata (timestamp, title, content, type, sources)
  • Maps context nodes to document sources appropriately

The abstraction of response handling complexity allows the workflow to focus on the business logic of creating and emitting the artifact.


403-403: Appropriate return value change.

Returning an empty string makes sense since the actual response content is now handled through the stream and artifact event, maintaining clean separation of concerns.

@thucpn thucpn merged commit af9ad3c into main Jun 6, 2025
60 checks passed