
Conversation


@thucpn thucpn commented Jul 10, 2025

Summary by CodeRabbit

  • New Features

    • Added support for file server integration in Python llamadeploy environments.
    • Enabled file retrieval from LlamaCloud when files are not found locally, using a new environment flag.
    • Exposed a new file server URL in the client configuration for improved file handling.
  • Bug Fixes

    • Improved validation logic for incompatible configuration options in llamadeploy environments.
  • Refactor

    • Enhanced starter question initialization for more robust configuration handling.
  • Chores

    • Updated the "@llamaindex/chat-ui" dependency to version 0.5.16.

@thucpn thucpn requested a review from marcusschiesser July 10, 2025 03:16

changeset-bot bot commented Jul 10, 2025

🦋 Changeset detected

Latest commit: b390da3

The changes in this PR will be included in the next version bump.

This PR includes changesets to release 3 packages:

| Name | Type |
| --- | --- |
| @llamaindex/server | Patch |
| llamaindex-server-examples | Patch |
| @create-llama/llama-index-server | Patch |



coderabbitai bot commented Jul 10, 2025

Walkthrough

This change introduces a configurable file server mechanism for the Python llamadeploy environment, adds support for LlamaCloud file retrieval in the file API, and exposes a FILE_SERVER_URL to the frontend. It updates dependency versions, enhances configuration options, and improves type safety and validation logic throughout the server and UI components.

Changes

| File(s) | Change Summary |
| --- | --- |
| .changeset/wise-ways-knock.md | Documents new file server support for llamadeploy in the changelog. |
| packages/server/next/app/api/files/[...slug]/route.ts, helpers.ts | Adds LlamaCloud file retrieval logic, a helper to download files, and pipeline/file name parsing. |
| packages/server/next/app/components/ui/chat/chat-section.tsx | Passes fileServerUrl config to chat workflow hook. |
| packages/server/next/app/components/ui/chat/chat-starter.tsx | Refactors starter questions initialization with improved array/type checks. |
| packages/server/package.json | Updates @llamaindex/chat-ui dependency from 0.5.12 to 0.5.16. |
| packages/server/src/server.ts, src/types.ts | Adds serverUrl and fileServer to server config/types; exposes FILE_SERVER_URL to client config; improves validation. |
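
For orientation, here is a hedged sketch of how the new options might be wired up when instantiating the server, assembled from the change summary above and the standard setup pattern (the exact API surface may differ):

```ts
// Hypothetical usage sketch based on the change summary; not the exact API.
import { LlamaIndexServer } from "@llamaindex/server";
import { workflowFactory } from "./src/app/workflow";

new LlamaIndexServer({
  workflow: workflowFactory,
  uiConfig: {
    // New per this PR: base URL of the deployed server ("" => current host).
    serverUrl: "http://localhost:3000",
  },
  // New per this PR: which file-server route to expose,
  // e.g. "data" or "output/llamacloud".
  fileServer: "output/llamacloud",
  port: 3000,
}).start();
```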

Sequence Diagram(s)

```mermaid
sequenceDiagram
    participant User
    participant ChatUI
    participant Server as Server (API)
    participant LlamaCloud

    User->>ChatUI: Requests a file via chat
    ChatUI->>Server: GET /api/files/[...slug]
    Server->>Server: Check local file existence
    alt File not found locally & LlamaCloud enabled
        Server->>LlamaCloud: Request file URL (with pipeline_id, file_name)
        LlamaCloud-->>Server: Return file download URL
        Server->>Server: Download file and save locally
        Server->>Server: Check file existence again
    end
    alt File found locally
        Server-->>ChatUI: Serve file
    else
        Server-->>ChatUI: 404 Not Found
    end
```
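
Translated into a route handler, the flow above roughly corresponds to the following sketch; the environment flag name and the URL-resolution helper are hypothetical stand-ins (only `downloadFile` is named in this review):

```ts
// Hedged sketch of GET /api/files/[...slug]; LLAMACLOUD_FILE_FALLBACK and
// getLlamaCloudFileUrl are hypothetical stand-ins for the real flag/helper.
import fs from "node:fs";
import path from "node:path";
import { downloadFile } from "../helpers";

declare function getLlamaCloudFileUrl(slug: string[]): Promise<string>;

export async function GET(
  _request: Request,
  { params }: { params: { slug: string[] } },
) {
  const filePath = path.join(process.cwd(), ...params.slug);

  // Fall back to LlamaCloud only when the flag is set and the file is missing.
  if (!fs.existsSync(filePath) && process.env.LLAMACLOUD_FILE_FALLBACK) {
    const url = await getLlamaCloudFileUrl(params.slug);
    await downloadFile(url, filePath);
  }

  // Re-check after the download attempt, as in the diagram above.
  if (!fs.existsSync(filePath)) {
    return new Response("Not Found", { status: 404 });
  }
  return new Response(fs.readFileSync(filePath));
}
```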

Suggested reviewers

  • marcusschiesser

Poem

In the warren of code, a new tunnel appears,
File servers now hop where llamas draw near.
Cloud or local, the files will be found,
With URLs and helpers, they bounce all around.
The chat grows more clever, the config more bright—
A rabbit’s delight on a servery night! 🐇✨


📜 Recent review details

Configuration used: .coderabbit.yaml
Review profile: CHILL
Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between e5a442e and b390da3.

⛔ Files ignored due to path filters (1)
  • pnpm-lock.yaml is excluded by !**/pnpm-lock.yaml
📒 Files selected for processing (1)
  • packages/server/package.json (1 hunks)
🚧 Files skipped from review as they are similar to previous changes (1)
  • packages/server/package.json
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (16)
  • GitHub Check: typescript (22, ubuntu-22.04, nextjs, llamacloud)
  • GitHub Check: typescript (22, windows-latest, nextjs, none)
  • GitHub Check: python (20, 3.11, windows-latest, fastapi, llamacloud)
  • GitHub Check: python (20, 3.11, ubuntu-22.04, fastapi, llamacloud)
  • GitHub Check: typescript (22, ubuntu-22.04, nextjs, none)
  • GitHub Check: typescript (22, windows-latest, nextjs, llamacloud)
  • GitHub Check: python (20, 3.11, macos-latest, fastapi, llamacloud)
  • GitHub Check: typescript (22, macos-latest, nextjs, none)
  • GitHub Check: typescript (22, macos-latest, nextjs, llamacloud)
  • GitHub Check: python (20, 3.11, windows-latest, fastapi, none)
  • GitHub Check: python (20, 3.11, ubuntu-22.04, fastapi, none)
  • GitHub Check: lint
  • GitHub Check: python (20, 3.11, macos-latest, fastapi, none)
  • GitHub Check: Unit Tests (windows-latest, 3.9)
  • GitHub Check: Unit Tests (ubuntu-latest, 3.9)
  • GitHub Check: Type Check



@coderabbitai coderabbitai bot left a comment


Actionable comments posted: 3

🧹 Nitpick comments (2)
packages/server/next/app/api/files/helpers.ts (1)

59-92: LGTM: Well-implemented downloadFile function with proper error handling.

The implementation is robust with several good practices:

  • HTTP status code validation
  • Directory creation with recursive option
  • Proper error handling and cleanup
  • Promise-based approach for better async handling

This is an improvement over the existing implementation in packages/server/src/utils/file.ts (lines 5-32), which lacks HTTP status code validation.

Consider these optional enhancements for production robustness:

```diff
 export async function downloadFile(
   urlToDownload: string,
   downloadedPath: string,
 ): Promise<void> {
+  // Validate URL format
+  if (!urlToDownload.startsWith('https://')) {
+    throw new Error('Only HTTPS URLs are supported');
+  }
+
   return new Promise((resolve, reject) => {
     const dir = path.dirname(downloadedPath);
     fs.mkdirSync(dir, { recursive: true });
     const file = fs.createWriteStream(downloadedPath);

-    https
+    const request = https
       .get(urlToDownload, (response) => {
         if (response.statusCode !== 200) {
           reject(
             new Error(`Failed to download file: Status ${response.statusCode}`),
           );
           return;
         }
         // ... rest of implementation
       })
       .on("error", (err) => {
         fs.unlink(downloadedPath, () => reject(err));
       });
+
+    // Set timeout to prevent hanging downloads
+    request.setTimeout(30000, () => {
+      request.destroy();
+      fs.unlink(downloadedPath, () => reject(new Error('Download timeout')));
+    });
   });
 }
```
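
For reference, calling the helper with the reviewed signature would look like this (the URL and target path are illustrative):

```ts
// Usage sketch; signature (urlToDownload, downloadedPath) taken from the diff above.
import { downloadFile } from "./helpers";

await downloadFile(
  "https://example.llamacloud.invalid/files/report.pdf", // illustrative URL
  "output/llamacloud/report.pdf",
);
```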
packages/server/src/server.ts (1)

117-120: Consider URL construction safety and validation.

The file server URL construction looks correct, but consider adding validation to ensure the URL is well-formed.

Apply this diff to add URL validation:

```diff
   // construct file server url for LlamaDeploy
   // eg. for Non-LlamaCloud: localhost:3000/deployments/chat/ui/api/files/data
   // eg. for LlamaCloud: localhost:3000/deployments/chat/ui/api/files/output/llamacloud
+  if (this.serverUrl && !this.serverUrl.startsWith('http')) {
+    console.warn('serverUrl should include protocol (http/https)');
+  }
   const fileServerUrl = `${this.serverUrl}${basePath}/api/files/${this.fileServer}`;
```
📜 Review details

Configuration used: .coderabbit.yaml
Review profile: CHILL
Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between 2b85420 and e5a442e.

⛔ Files ignored due to path filters (1)
  • pnpm-lock.yaml is excluded by !**/pnpm-lock.yaml
📒 Files selected for processing (8)
  • .changeset/wise-ways-knock.md (1 hunks)
  • packages/server/next/app/api/files/[...slug]/route.ts (1 hunks)
  • packages/server/next/app/api/files/helpers.ts (2 hunks)
  • packages/server/next/app/components/ui/chat/chat-section.tsx (1 hunks)
  • packages/server/next/app/components/ui/chat/chat-starter.tsx (1 hunks)
  • packages/server/package.json (1 hunks)
  • packages/server/src/server.ts (4 hunks)
  • packages/server/src/types.ts (1 hunks)
🧰 Additional context used
📓 Path-based instructions (6)
`**/*.{ts,tsx}`: TypeScript code should be linted using ESLint and formatted with Prettier, as enforced by 'pnpm lint' and 'pnpm format' at the root level.

📄 Source: CodeRabbit Inference Engine (CLAUDE.md)

List of files the instruction was applied to:

  • packages/server/src/types.ts
  • packages/server/next/app/components/ui/chat/chat-section.tsx
  • packages/server/next/app/api/files/helpers.ts
  • packages/server/next/app/components/ui/chat/chat-starter.tsx
  • packages/server/next/app/api/files/[...slug]/route.ts
  • packages/server/src/server.ts
`packages/server/**/*.{ts,tsx}`: TypeScript server code should be located in 'packages/server/' and use Next.js as the framework, with the core server logic implemented in a 'LlamaIndexServer' class.

📄 Source: CodeRabbit Inference Engine (CLAUDE.md)

List of files the instruction was applied to:

  • packages/server/src/types.ts
  • packages/server/next/app/components/ui/chat/chat-section.tsx
  • packages/server/next/app/api/files/helpers.ts
  • packages/server/next/app/components/ui/chat/chat-starter.tsx
  • packages/server/next/app/api/files/[...slug]/route.ts
  • packages/server/src/server.ts
`packages/server/src/types.ts`: Type definitions for WorkflowFactory, UIConfig, and LlamaIndexServerOptions should be implemented in src/types.ts.

📄 Source: CodeRabbit Inference Engine (packages/server/CLAUDE.md)

List of files the instruction was applied to:

  • packages/server/src/types.ts
`packages/server/**/*.{ts,tsx}`: TypeScript should be used throughout the codebase for type safety.

📄 Source: CodeRabbit Inference Engine (packages/server/CLAUDE.md)

List of files the instruction was applied to:

  • packages/server/src/types.ts
  • packages/server/next/app/components/ui/chat/chat-section.tsx
  • packages/server/next/app/api/files/helpers.ts
  • packages/server/next/app/components/ui/chat/chat-starter.tsx
  • packages/server/next/app/api/files/[...slug]/route.ts
  • packages/server/src/server.ts
`packages/server/next/**/*.{js,jsx,ts,tsx}`: UI components for the chat interface, including message history, streaming responses, canvas panel, and custom layouts, should be implemented in the next/ directory using shadcn/ui components and Tailwind CSS.
Tailwind CSS should be used for styling UI components.

📄 Source: CodeRabbit Inference Engine (packages/server/CLAUDE.md)

List of files the instruction was applied to:

  • packages/server/next/app/components/ui/chat/chat-section.tsx
  • packages/server/next/app/api/files/helpers.ts
  • packages/server/next/app/components/ui/chat/chat-starter.tsx
  • packages/server/next/app/api/files/[...slug]/route.ts
`packages/server/src/server.ts`: The LlamaIndexServer class should be implemented in src/server.ts and serve as the main server implementation that wraps Next.js.

📄 Source: CodeRabbit Inference Engine (packages/server/CLAUDE.md)

List of files the instruction was applied to:

  • packages/server/src/server.ts
🧠 Learnings (9)
📓 Common learnings
Learnt from: CR
PR: run-llama/create-llama#0
File: python/llama-index-server/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:20:25.875Z
Learning: Applies to python/llama-index-server/services/**/* : Business logic for file handling, LlamaCloud integration, and UI generation should be implemented within the services/ directory.
Learnt from: CR
PR: run-llama/create-llama#0
File: CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:26.711Z
Learning: Applies to python/llama-index-server/**/*.py : Python server code should be located in 'python/llama-index-server/' and use FastAPI, with the core server logic implemented in a 'LlamaIndexServer' class.
Learnt from: CR
PR: run-llama/create-llama#0
File: python/llama-index-server/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:20:25.875Z
Learning: Applies to python/llama-index-server/{data,output}/**/* : Static files from data/ and output/ directories must be served at /api/files/data/* and /api/files/output/* endpoints.
Learnt from: CR
PR: run-llama/create-llama#0
File: python/llama-index-server/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:20:25.875Z
Learning: Applies to python/llama-index-server/llama_index/server/resources/**/* : Bundled UI assets should be included in llama_index/server/resources for package distribution.
Learnt from: CR
PR: run-llama/create-llama#0
File: python/llama-index-server/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:20:25.875Z
Learning: Applies to python/llama-index-server/resources/**/* : Static assets and bundled UI files should be placed in the resources/ directory.
Learnt from: CR
PR: run-llama/create-llama#0
File: python/llama-index-server/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:20:25.875Z
Learning: Applies to python/llama-index-server/api/routers/chat.py : The /api/chat endpoint must support streaming responses compatible with Vercel, background tasks for file downloads, and LlamaCloud integration if enabled.
Learnt from: CR
PR: run-llama/create-llama#0
File: python/llama-index-server/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:20:25.875Z
Learning: Applies to python/llama-index-server/**/pyproject.toml : Package configuration, dependencies, and build settings must be specified in pyproject.toml.
Learnt from: CR
PR: run-llama/create-llama#0
File: python/llama-index-server/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:20:25.875Z
Learning: Applies to python/llama-index-server/.ui/**/* : Downloaded UI static files should be placed in the .ui/ directory.
Learnt from: CR
PR: run-llama/create-llama#0
File: python/llama-index-server/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:20:25.875Z
Learning: Applies to python/llama-index-server/**/.env : Environment variables should be managed using .env files for API keys and configuration.
Learnt from: CR
PR: run-llama/create-llama#0
File: python/llama-index-server/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:20:25.875Z
Learning: Applies to python/llama-index-server/llama_index/server/server.py : The main LlamaIndexServer class should be implemented in llama_index/server/server.py and extend FastAPI.
packages/server/package.json (9)
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/server/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:57.724Z
Learning: Applies to packages/server/next/**/*.{js,jsx,ts,tsx} : UI components for the chat interface, including message history, streaming responses, canvas panel, and custom layouts, should be implemented in the next/ directory using shadcn/ui components and Tailwind CSS.
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/create-llama/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:39.549Z
Learning: Applies to packages/create-llama/**/package.json : The package configuration and binary entry point should be defined in `package.json`, with the binary pointing to `./dist/index.js`.
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/server/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:57.724Z
Learning: Applies to packages/server/src/types.ts : Type definitions for WorkflowFactory, UIConfig, and LlamaIndexServerOptions should be implemented in src/types.ts.
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/server/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:57.724Z
Learning: Applies to packages/server/src/server.ts : The LlamaIndexServer class should be implemented in src/server.ts and serve as the main server implementation that wraps Next.js.
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/server/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:57.724Z
Learning: Applies to packages/server/next/api/{chat,components,layout,files/**}/route.ts : API route handlers for chat, components, layout, and file serving should be implemented in next/api/{chat,components,layout,files/**}/route.ts.
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/server/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:57.724Z
Learning: Applies to packages/server/src/utils/gen-ui.ts : The generateEventComponent function, responsible for using LLMs to auto-generate React components, should be implemented in src/utils/gen-ui.ts.
Learnt from: CR
PR: run-llama/create-llama#0
File: CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:26.711Z
Learning: Applies to packages/server/**/*.{ts,tsx} : TypeScript server code should be located in 'packages/server/' and use Next.js as the framework, with the core server logic implemented in a 'LlamaIndexServer' class.
Learnt from: CR
PR: run-llama/create-llama#0
File: python/llama-index-server/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:20:25.875Z
Learning: Applies to python/llama-index-server/llama_index/server/resources/**/* : Bundled UI assets should be included in llama_index/server/resources for package distribution.
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/server/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:57.724Z
Learning: Applies to packages/server/next/**/*.{js,jsx,ts,tsx} : Tailwind CSS should be used for styling UI components.
packages/server/src/types.ts (14)
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/server/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:57.724Z
Learning: Applies to packages/server/src/types.ts : Type definitions for WorkflowFactory, UIConfig, and LlamaIndexServerOptions should be implemented in src/types.ts.
Learnt from: CR
PR: run-llama/create-llama#0
File: CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:26.711Z
Learning: Applies to packages/server/**/*.{ts,tsx} : TypeScript server code should be located in 'packages/server/' and use Next.js as the framework, with the core server logic implemented in a 'LlamaIndexServer' class.
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/server/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:57.724Z
Learning: Applies to packages/server/src/server.ts : The LlamaIndexServer class should be implemented in src/server.ts and serve as the main server implementation that wraps Next.js.
Learnt from: CR
PR: run-llama/create-llama#0
File: python/llama-index-server/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:20:25.875Z
Learning: Applies to python/llama-index-server/llama_index/server/__init__.py : Package exports, including LlamaIndexServer, UIConfig, and UIEvent, should be defined in llama_index/server/__init__.py.
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/server/examples/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:19:29.893Z
Learning: Applies to packages/server/examples/{simple-workflow/calculator.ts,agentic-rag/index.ts,custom-layout/index.ts,devmode/index.ts,src/app/workflow.ts} : Use the standard server setup pattern: instantiate `LlamaIndexServer` with `workflow`, `uiConfig`, and `port`, then call `.start()`.
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/server/examples/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:19:29.893Z
Learning: Applies to packages/server/examples/{simple-workflow/calculator.ts,agentic-rag/index.ts,custom-layout/index.ts,devmode/index.ts,src/app/workflow.ts} : Configure UI with options such as `starterQuestions`, `layoutDir`, `devMode`, and `suggestNextQuestions` in the server setup.
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/server/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:57.724Z
Learning: Applies to packages/server/src/events.ts : Event system logic, including source, agent, and artifact events, as well as helper functions for converting LlamaIndex data to UI events, should be implemented in src/events.ts.
Learnt from: CR
PR: run-llama/create-llama#0
File: python/llama-index-server/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:20:25.875Z
Learning: Applies to python/llama-index-server/llama_index/server/resources/**/* : Bundled UI assets should be included in llama_index/server/resources for package distribution.
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/server/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:57.724Z
Learning: Applies to packages/server/src/utils/gen-ui.ts : The generateEventComponent function, responsible for using LLMs to auto-generate React components, should be implemented in src/utils/gen-ui.ts.
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/server/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:57.724Z
Learning: Applies to packages/server/**/*.{ts,tsx} : TypeScript should be used throughout the codebase for type safety.
Learnt from: CR
PR: run-llama/create-llama#0
File: python/llama-index-server/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:20:25.875Z
Learning: The LlamaIndexServer should be configured using the workflow_factory parameter, with environment and UI configuration options as shown in the provided example.
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/create-llama/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:39.549Z
Learning: Applies to packages/create-llama/**/index.ts : The CLI should accept command-line options for framework selection, template type, model providers, vector databases, data sources, tools, and observability options.
Learnt from: CR
PR: run-llama/create-llama#0
File: python/llama-index-server/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:20:25.875Z
Learning: Applies to python/llama-index-server/llama_index/server/server.py : The main LlamaIndexServer class should be implemented in llama_index/server/server.py and extend FastAPI.
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/server/examples/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:19:29.893Z
Learning: Applies to packages/server/examples/src/app/workflow*.ts : Organize workflow files separately in development mode, e.g., `src/app/workflow.ts`.
packages/server/next/app/components/ui/chat/chat-section.tsx (11)
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/server/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:57.724Z
Learning: Applies to packages/server/next/api/{chat,components,layout,files/**}/route.ts : API route handlers for chat, components, layout, and file serving should be implemented in next/api/{chat,components,layout,files/**}/route.ts.
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/server/examples/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:19:29.893Z
Learning: Applies to packages/server/examples/{simple-workflow/calculator.ts,agentic-rag/index.ts,custom-layout/index.ts,devmode/index.ts,src/app/workflow.ts} : Configure UI with options such as `starterQuestions`, `layoutDir`, `devMode`, and `suggestNextQuestions` in the server setup.
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/server/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:57.724Z
Learning: Applies to packages/server/next/**/*.{js,jsx,ts,tsx} : UI components for the chat interface, including message history, streaming responses, canvas panel, and custom layouts, should be implemented in the next/ directory using shadcn/ui components and Tailwind CSS.
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/server/examples/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:19:29.893Z
Learning: Applies to packages/server/examples/src/app/workflow*.ts : Organize workflow files separately in development mode, e.g., `src/app/workflow.ts`.
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/server/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:57.724Z
Learning: Applies to packages/server/src/types.ts : Type definitions for WorkflowFactory, UIConfig, and LlamaIndexServerOptions should be implemented in src/types.ts.
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/server/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:57.724Z
Learning: Applies to packages/server/src/handlers/chat.ts : The handleChat function should process POST requests to /api/chat and be implemented in src/handlers/chat.ts.
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/server/examples/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:19:29.893Z
Learning: Applies to packages/server/examples/{simple-workflow/calculator.ts,agentic-rag/index.ts,custom-layout/index.ts,devmode/index.ts,src/app/workflow.ts} : Use the workflow factory pattern for workflow creation, i.e., define `workflowFactory` as a function returning an agent instance, optionally async.
Learnt from: CR
PR: run-llama/create-llama#0
File: python/llama-index-server/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:20:25.875Z
Learning: Workflow factory functions should accept a ChatRequest and return a Workflow instance, following the documented contract.
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/server/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:57.724Z
Learning: Applies to packages/server/src/utils/workflow.ts : The runWorkflow function should execute workflows with proper event handling and be implemented in src/utils/workflow.ts.
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/server/examples/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:19:29.893Z
Learning: Applies to packages/server/examples/{simple-workflow/calculator.ts,agentic-rag/index.ts,custom-layout/index.ts,devmode/index.ts,src/app/workflow.ts} : Use the standard server setup pattern: instantiate `LlamaIndexServer` with `workflow`, `uiConfig`, and `port`, then call `.start()`.
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/server/examples/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:19:29.893Z
Learning: Applies to packages/server/examples/**/*.ts : Use the `agent()` function from `@llamaindex/workflow` with tool arrays for agent creation.
packages/server/next/app/api/files/helpers.ts (6)
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/server/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:57.724Z
Learning: Applies to packages/server/next/api/{chat,components,layout,files/**}/route.ts : API route handlers for chat, components, layout, and file serving should be implemented in next/api/{chat,components,layout,files/**}/route.ts.
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/server/examples/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:19:29.893Z
Learning: Applies to packages/server/examples/**/*.{ts,tsx} : Demonstrate proper async/await patterns and error handling for LLM operations.
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/server/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:57.724Z
Learning: Applies to packages/server/src/server.ts : The LlamaIndexServer class should be implemented in src/server.ts and serve as the main server implementation that wraps Next.js.
Learnt from: CR
PR: run-llama/create-llama#0
File: CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:26.711Z
Learning: Applies to packages/server/**/*.{ts,tsx} : TypeScript server code should be located in 'packages/server/' and use Next.js as the framework, with the core server logic implemented in a 'LlamaIndexServer' class.
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/server/examples/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:19:29.893Z
Learning: Applies to packages/server/examples/**/*.ts : Use the `agent()` function from `@llamaindex/workflow` with tool arrays for agent creation.
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/server/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:57.724Z
Learning: Applies to packages/server/{data,output}/** : Data and output folders for file integration should be mounted and served as static assets via Next.js.
.changeset/wise-ways-knock.md (12)
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/server/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:57.724Z
Learning: Applies to packages/server/src/server.ts : The LlamaIndexServer class should be implemented in src/server.ts and serve as the main server implementation that wraps Next.js.
Learnt from: CR
PR: run-llama/create-llama#0
File: python/llama-index-server/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:20:25.875Z
Learning: Applies to python/llama-index-server/llama_index/server/__init__.py : Package exports, including LlamaIndexServer, UIConfig, and UIEvent, should be defined in llama_index/server/__init__.py.
Learnt from: CR
PR: run-llama/create-llama#0
File: CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:26.711Z
Learning: Applies to python/llama-index-server/**/*.py : Python server code should be located in 'python/llama-index-server/' and use FastAPI, with the core server logic implemented in a 'LlamaIndexServer' class.
Learnt from: CR
PR: run-llama/create-llama#0
File: python/llama-index-server/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:20:25.875Z
Learning: Applies to python/llama-index-server/llama_index/server/resources/**/* : Bundled UI assets should be included in llama_index/server/resources for package distribution.
Learnt from: CR
PR: run-llama/create-llama#0
File: python/llama-index-server/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:20:25.875Z
Learning: Applies to python/llama-index-server/services/**/* : Business logic for file handling, LlamaCloud integration, and UI generation should be implemented within the services/ directory.
Learnt from: CR
PR: run-llama/create-llama#0
File: python/llama-index-server/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:20:25.875Z
Learning: Applies to python/llama-index-server/**/pyproject.toml : Package configuration, dependencies, and build settings must be specified in pyproject.toml.
Learnt from: CR
PR: run-llama/create-llama#0
File: python/llama-index-server/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:20:25.875Z
Learning: Applies to python/llama-index-server/llama_index/server/server.py : The main LlamaIndexServer class should be implemented in llama_index/server/server.py and extend FastAPI.
Learnt from: CR
PR: run-llama/create-llama#0
File: CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:26.711Z
Learning: Applies to packages/server/**/*.{ts,tsx} : TypeScript server code should be located in 'packages/server/' and use Next.js as the framework, with the core server logic implemented in a 'LlamaIndexServer' class.
Learnt from: CR
PR: run-llama/create-llama#0
File: python/llama-index-server/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:20:25.875Z
Learning: Applies to python/llama-index-server/{data,output}/**/* : Static files from data/ and output/ directories must be served at /api/files/data/* and /api/files/output/* endpoints.
Learnt from: CR
PR: run-llama/create-llama#0
File: python/llama-index-server/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:20:25.875Z
Learning: Applies to python/llama-index-server/api/routers/chat.py : The /api/chat endpoint must support streaming responses compatible with Vercel, background tasks for file downloads, and LlamaCloud integration if enabled.
Learnt from: CR
PR: run-llama/create-llama#0
File: CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:26.711Z
Learning: Applies to python/llama-index-server/**/*test*.py : Python unit tests should use pytest and provide comprehensive API and service coverage.
Learnt from: CR
PR: run-llama/create-llama#0
File: python/llama-index-server/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:20:25.875Z
Learning: Applies to python/llama-index-server/**/test_*.py : Tests should be implemented using pytest, pytest-asyncio, and pytest-mock.
packages/server/next/app/components/ui/chat/chat-starter.tsx (3)
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/server/examples/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:19:29.893Z
Learning: Applies to packages/server/examples/{simple-workflow/calculator.ts,agentic-rag/index.ts,custom-layout/index.ts,devmode/index.ts,src/app/workflow.ts} : Configure UI with options such as `starterQuestions`, `layoutDir`, `devMode`, and `suggestNextQuestions` in the server setup.
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/server/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:57.724Z
Learning: Applies to packages/server/next/**/*.{js,jsx,ts,tsx} : UI components for the chat interface, including message history, streaming responses, canvas panel, and custom layouts, should be implemented in the next/ directory using shadcn/ui components and Tailwind CSS.
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/server/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:57.724Z
Learning: Applies to packages/server/next/api/{chat,components,layout,files/**}/route.ts : API route handlers for chat, components, layout, and file serving should be implemented in next/api/{chat,components,layout,files/**}/route.ts.
packages/server/next/app/api/files/[...slug]/route.ts (6)
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/server/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:57.724Z
Learning: Applies to packages/server/next/api/{chat,components,layout,files/**}/route.ts : API route handlers for chat, components, layout, and file serving should be implemented in next/api/{chat,components,layout,files/**}/route.ts.
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/server/examples/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:19:29.893Z
Learning: Applies to packages/server/examples/**/*.{ts,tsx} : Demonstrate proper async/await patterns and error handling for LLM operations.
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/server/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:57.724Z
Learning: Applies to packages/server/src/server.ts : The LlamaIndexServer class should be implemented in src/server.ts and serve as the main server implementation that wraps Next.js.
Learnt from: CR
PR: run-llama/create-llama#0
File: CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:26.711Z
Learning: Applies to packages/server/**/*.{ts,tsx} : TypeScript server code should be located in 'packages/server/' and use Next.js as the framework, with the core server logic implemented in a 'LlamaIndexServer' class.
Learnt from: CR
PR: run-llama/create-llama#0
File: python/llama-index-server/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:20:25.875Z
Learning: Applies to python/llama-index-server/services/**/* : Business logic for file handling, LlamaCloud integration, and UI generation should be implemented within the services/ directory.
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/server/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:57.724Z
Learning: Applies to packages/server/{data,output}/** : Data and output folders for file integration should be mounted and served as static assets via Next.js.
packages/server/src/server.ts (28)
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/server/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:57.724Z
Learning: Applies to packages/server/src/server.ts : The LlamaIndexServer class should be implemented in src/server.ts and serve as the main server implementation that wraps Next.js.
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/server/examples/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:19:29.893Z
Learning: Applies to packages/server/examples/{simple-workflow/calculator.ts,agentic-rag/index.ts,custom-layout/index.ts,devmode/index.ts,src/app/workflow.ts} : Use the standard server setup pattern: instantiate `LlamaIndexServer` with `workflow`, `uiConfig`, and `port`, then call `.start()`.
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/server/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:57.724Z
Learning: Applies to packages/server/src/types.ts : Type definitions for WorkflowFactory, UIConfig, and LlamaIndexServerOptions should be implemented in src/types.ts.
Learnt from: CR
PR: run-llama/create-llama#0
File: CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:26.711Z
Learning: Applies to packages/server/**/*.{ts,tsx} : TypeScript server code should be located in 'packages/server/' and use Next.js as the framework, with the core server logic implemented in a 'LlamaIndexServer' class.
Learnt from: CR
PR: run-llama/create-llama#0
File: python/llama-index-server/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:20:25.875Z
Learning: Applies to python/llama-index-server/llama_index/server/__init__.py : Package exports, including LlamaIndexServer, UIConfig, and UIEvent, should be defined in llama_index/server/__init__.py.
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/server/examples/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:19:29.893Z
Learning: Applies to packages/server/examples/{simple-workflow/calculator.ts,agentic-rag/index.ts,custom-layout/index.ts,devmode/index.ts,src/app/workflow.ts} : Configure UI with options such as `starterQuestions`, `layoutDir`, `devMode`, and `suggestNextQuestions` in the server setup.
Learnt from: CR
PR: run-llama/create-llama#0
File: CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:26.711Z
Learning: Applies to python/llama-index-server/**/*.py : Python server code should be located in 'python/llama-index-server/' and use FastAPI, with the core server logic implemented in a 'LlamaIndexServer' class.
Learnt from: CR
PR: run-llama/create-llama#0
File: python/llama-index-server/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:20:25.875Z
Learning: Applies to python/llama-index-server/llama_index/server/server.py : The main LlamaIndexServer class should be implemented in llama_index/server/server.py and extend FastAPI.
Learnt from: CR
PR: run-llama/create-llama#0
File: python/llama-index-server/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:20:25.875Z
Learning: Applies to python/llama-index-server/llama_index/server/resources/**/* : Bundled UI assets should be included in llama_index/server/resources for package distribution.
Learnt from: CR
PR: run-llama/create-llama#0
File: python/llama-index-server/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:20:25.875Z
Learning: The LlamaIndexServer should be configured using the workflow_factory parameter, with environment and UI configuration options as shown in the provided example.
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/server/examples/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:19:29.893Z
Learning: Applies to packages/server/examples/layout/**/*.tsx : Place custom React layout components in the `layout/` directory, e.g., `layout/header.tsx`.
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/server/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:57.724Z
Learning: Applies to packages/server/next/**/*.{js,jsx,ts,tsx} : UI components for the chat interface, including message history, streaming responses, canvas panel, and custom layouts, should be implemented in the next/ directory using shadcn/ui components and Tailwind CSS.
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/server/examples/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:19:29.893Z
Learning: Applies to packages/server/examples/src/app/workflow*.ts : Organize workflow files separately in development mode, e.g., `src/app/workflow.ts`.
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/server/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:57.724Z
Learning: Applies to packages/server/next/api/{chat,components,layout,files/**}/route.ts : API route handlers for chat, components, layout, and file serving should be implemented in next/api/{chat,components,layout,files/**}/route.ts.
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/server/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:57.724Z
Learning: Applies to packages/server/next/**/*.{js,jsx,ts,tsx} : Tailwind CSS should be used for styling UI components.
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/server/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:57.724Z
Learning: Applies to packages/server/public/config.js : Static assets, including client-side config, should be placed in the public/ directory (e.g., public/config.js).
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/server/examples/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:19:29.893Z
Learning: Applies to packages/server/examples/custom-layout/layout/** : Exclude runtime React components in `custom-layout/layout` from TypeScript compilation output.
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/server/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:57.724Z
Learning: Applies to packages/server/{data,output}/** : Data and output folders for file integration should be mounted and served as static assets via Next.js.
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/create-llama/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:39.549Z
Learning: Applies to packages/create-llama/**/index.ts : The CLI should accept command-line options for framework selection, template type, model providers, vector databases, data sources, tools, and observability options.
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/create-llama/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:39.549Z
Learning: Applies to packages/create-llama/questions/**/* : Interactive prompts for user configuration should be implemented in the `questions/` directory.
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/create-llama/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:39.549Z
Learning: Applies to packages/create-llama/**/index.ts : The main CLI entry point should be implemented in `index.ts` using Commander.js for argument parsing.
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/create-llama/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:39.549Z
Learning: Applies to packages/create-llama/**/package.json : The package configuration and binary entry point should be defined in `package.json`, with the binary pointing to `./dist/index.js`.
Learnt from: CR
PR: run-llama/create-llama#0
File: python/llama-index-server/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:20:25.875Z
Learning: Applies to python/llama-index-server/{components,layout}/**/* : Custom UI components should be placed in the components/ directory, and custom layout sections in the layout/ directory.
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/create-llama/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:39.549Z
Learning: Applies to packages/create-llama/**/create-app.ts : Core application creation logic and orchestration should be implemented in `create-app.ts`.
Learnt from: CR
PR: run-llama/create-llama#0
File: python/llama-index-server/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:20:25.875Z
Learning: Applies to python/llama-index-server/.ui/**/* : Downloaded UI static files should be placed in the .ui/ directory.
Learnt from: CR
PR: run-llama/create-llama#0
File: python/llama-index-server/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:20:25.875Z
Learning: Applies to python/llama-index-server/resources/**/* : Static assets and bundled UI files should be placed in the resources/ directory.
Learnt from: CR
PR: run-llama/create-llama#0
File: python/llama-index-server/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:20:25.875Z
Learning: Applies to python/llama-index-server/services/**/* : Business logic for file handling, LlamaCloud integration, and UI generation should be implemented within the services/ directory.
Learnt from: CR
PR: run-llama/create-llama#0
File: python/llama-index-server/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:20:25.875Z
Learning: Applies to python/llama-index-server/{data,output}/**/* : Static files from data/ and output/ directories must be served at /api/files/data/* and /api/files/output/* endpoints.
🧬 Code Graph Analysis (3)
packages/server/next/app/components/ui/chat/chat-section.tsx (1)
packages/server/next/app/components/ui/lib/utils.ts (1)
  • getConfig (10-13)
packages/server/next/app/api/files/helpers.ts (1)
packages/server/src/utils/file.ts (1)
  • downloadFile (5-32)
packages/server/next/app/components/ui/chat/chat-starter.tsx (1)
packages/server/next/app/components/ui/lib/utils.ts (1)
  • getConfig (10-13)
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (14)
  • GitHub Check: typescript (22, macos-latest, nextjs, llamacloud)
  • GitHub Check: python (20, 3.11, macos-latest, fastapi, none)
  • GitHub Check: typescript (22, ubuntu-22.04, nextjs, none)
  • GitHub Check: typescript (22, windows-latest, nextjs, llamacloud)
  • GitHub Check: typescript (22, ubuntu-22.04, nextjs, llamacloud)
  • GitHub Check: lint
  • GitHub Check: python (20, 3.11, macos-latest, fastapi, llamacloud)
  • GitHub Check: typescript (22, windows-latest, nextjs, none)
  • GitHub Check: typescript (22, macos-latest, nextjs, none)
  • GitHub Check: python (20, 3.11, ubuntu-22.04, fastapi, llamacloud)
  • GitHub Check: python (20, 3.11, windows-latest, fastapi, none)
  • GitHub Check: python (20, 3.11, windows-latest, fastapi, llamacloud)
  • GitHub Check: python (20, 3.11, ubuntu-22.04, fastapi, none)
  • GitHub Check: Unit Tests (windows-latest, 3.9)
🔇 Additional comments (12)
packages/server/next/app/api/files/helpers.ts (1)

3-3: LGTM: HTTPS import added for file download functionality.

The HTTPS module import is correctly added to support the new download functionality.

.changeset/wise-ways-knock.md (1)

1-6: LGTM: Changeset properly documents the file server feature addition.

The changeset correctly identifies this as a patch release and provides a clear description of the new file server functionality for Python llamadeploy.

packages/server/next/app/components/ui/chat/chat-section.tsx (1)

41-41: FILE_SERVER_URL configuration verified in server setup

The FILE_SERVER_URL key is emitted in packages/server/src/server.ts when writing out the config, confirming it’s available for getConfig("FILE_SERVER_URL").

LGTM—no further action needed.
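
For context, the client-side read presumably follows the existing getConfig pattern from the code-graph analysis; how the value is threaded into the chat workflow hook is an assumption:

```ts
// Sketch of the client read; getConfig is the existing helper in
// next/app/components/ui/lib/utils.ts referenced in this review.
import { getConfig } from "../lib/utils";

const fileServerUrl = getConfig("FILE_SERVER_URL");
// Presumably passed on to the chat workflow hook in chat-section.tsx (line 41).
```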

packages/server/src/types.ts (2)

28-28: LGTM: serverUrl property added to UIConfig type.

The optional serverUrl property is correctly typed and maintains backward compatibility.


34-34: LGTM: fileServer property added to LlamaIndexServerOptions type.

The optional fileServer property is correctly typed and follows the established pattern for configuration options.
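
Put together, the two additions plausibly read as follows in src/types.ts; surrounding members are elided, and whether these are interfaces or type aliases is not shown in the review:

```ts
// Sketch of the new optional fields; other members of these types elided.
export interface UIConfig {
  // ...existing UI options...
  serverUrl?: string; // base URL of the deployed server (line 28)
}

export interface LlamaIndexServerOptions {
  // ...existing server options...
  fileServer?: string; // e.g. "data" or "output/llamacloud" (line 34)
}
```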

packages/server/next/app/components/ui/chat/chat-starter.tsx (1)

9-15: LGTM! Improved defensive programming.

The refactoring enhances robustness by explicitly validating the configuration value before use. The separation of concerns (fetching config vs. using config) and the explicit array validation make the code more maintainable and less prone to runtime errors.
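
A minimal sketch of the defensive pattern being praised here; the config key and the filtering details are assumptions:

```ts
// Hedged sketch of robust starter-question initialization.
import { getConfig } from "../lib/utils";

const raw = getConfig("STARTER_QUESTIONS"); // key name assumed
const starterQuestions: string[] = Array.isArray(raw)
  ? raw.filter((q): q is string => typeof q === "string") // keep only strings
  : []; // anything non-array falls back to an empty list
```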

packages/server/next/app/api/files/[...slug]/route.ts (2)

2-2: LGTM! Appropriate imports for LlamaCloud integration.

The imports are properly structured and support the new file server functionality.

Also applies to: 5-5


49-49: LGTM! Proper file existence check after download attempt.

The file existence check is correctly placed after the LlamaCloud download logic to ensure newly downloaded files are detected.

packages/server/src/server.ts (4)

26-27: LGTM! Appropriate new properties for file server functionality.

The new serverUrl and fileServer properties are well-positioned and support the enhanced file server capabilities.


39-43: LGTM! Solid initialization logic with appropriate defaults.

The initialization logic correctly handles:

  • serverUrl defaulting to empty string (using current host)
  • fileServer path selection based on LlamaCloud availability
  • Proper fallback to sensible defaults
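
Those defaults plausibly reduce to a couple of lines; detecting LlamaCloud via LLAMA_CLOUD_API_KEY is an assumption here:

```ts
// Sketch of the constructor defaults described above; not the actual class.
class LlamaIndexServerSketch {
  serverUrl: string;
  fileServer: string;

  constructor(options: { serverUrl?: string; fileServer?: string }) {
    this.serverUrl = options.serverUrl ?? ""; // "" => use the current host
    this.fileServer =
      options.fileServer ??
      (process.env.LLAMA_CLOUD_API_KEY ? "output/llamacloud" : "data");
  }
}
```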

51-58: LGTM! Improved validation for llamaDeploy configuration.

The expanded validation correctly prevents incompatible features from being used together with LlamaDeploy. The error message is clear and informative.
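
The guard plausibly looks like the sketch below; which options count as incompatible with llamaDeploy, beyond what the summary states, is assumed for illustration:

```ts
// Hedged sketch of the expanded llamaDeploy validation.
interface ValidationOptions {
  llamaDeploy?: boolean;
  devMode?: boolean;
  suggestNextQuestions?: boolean;
}

function assertLlamaDeployCompatible(options: ValidationOptions): void {
  if (options.llamaDeploy && (options.devMode || options.suggestNextQuestions)) {
    throw new Error(
      "devMode and suggestNextQuestions are not supported when llamaDeploy is enabled",
    );
  }
}
```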


134-135: LGTM! Proper exposure of file server URL to client.

The FILE_SERVER_URL configuration is correctly added to the client-side configuration object, enabling frontend components to access the file server.
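
Following the public/config.js convention noted in the path-based instructions, the emitted client config presumably just gains one key; the global name and sibling keys are assumptions, while the URL value matches the LlamaCloud example from the earlier comment:

```ts
// Sketch of the generated client config; everything except FILE_SERVER_URL
// mirrors an assumed existing shape.
(window as any).LLAMAINDEX = {
  // ...existing keys...
  FILE_SERVER_URL:
    "http://localhost:3000/deployments/chat/ui/api/files/output/llamacloud",
};
```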

@thucpn thucpn requested a review from marcusschiesser July 10, 2025 09:08
@thucpn thucpn merged commit 91ce4e1 into main Jul 10, 2025
18 checks passed