feat: support file server for python llamadeploy #703
Conversation
🦋 Changeset detected
Latest commit: b390da3
The changes in this PR will be included in the next version bump. This PR includes changesets to release 3 packages.
Walkthrough
This change introduces a configurable file server mechanism for the Python llamadeploy environment, adds support for LlamaCloud file retrieval in the file API, and exposes the file server URL to the client UI.
Sequence Diagram(s)

```mermaid
sequenceDiagram
    participant User
    participant ChatUI
    participant Server as Server (API)
    participant LlamaCloud
    User->>ChatUI: Requests a file via chat
    ChatUI->>Server: GET /api/files/[...slug]
    Server->>Server: Check local file existence
    alt File not found locally & LlamaCloud enabled
        Server->>LlamaCloud: Request file URL (with pipeline_id, file_name)
        LlamaCloud-->>Server: Return file download URL
        Server->>Server: Download file and save locally
        Server->>Server: Check file existence again
    end
    alt File found locally
        Server-->>ChatUI: Serve file
    else
        Server-->>ChatUI: 404 Not Found
    end
```
Actionable comments posted: 3
🧹 Nitpick comments (2)
packages/server/next/app/api/files/helpers.ts (1)
59-92: LGTM: Well-implemented downloadFile function with proper error handling. The implementation is robust with several good practices:
- HTTP status code validation
- Directory creation with recursive option
- Proper error handling and cleanup
- Promise-based approach for better async handling
This is an improvement over the existing implementation in packages/server/src/utils/file.ts (lines 5-32), which lacks HTTP status code validation. Consider these optional enhancements for production robustness:

```diff
 export async function downloadFile(
   urlToDownload: string,
   downloadedPath: string,
 ): Promise<void> {
+  // Validate URL format
+  if (!urlToDownload.startsWith('https://')) {
+    throw new Error('Only HTTPS URLs are supported');
+  }
+
   return new Promise((resolve, reject) => {
     const dir = path.dirname(downloadedPath);
     fs.mkdirSync(dir, { recursive: true });

     const file = fs.createWriteStream(downloadedPath);
-    https
+    const request = https
       .get(urlToDownload, (response) => {
         if (response.statusCode !== 200) {
           reject(
             new Error(`Failed to download file: Status ${response.statusCode}`),
           );
           return;
         }
         // ... rest of implementation
       })
       .on("error", (err) => {
         fs.unlink(downloadedPath, () => reject(err));
       });
+
+    // Set timeout to prevent hanging downloads
+    request.setTimeout(30000, () => {
+      request.destroy();
+      fs.unlink(downloadedPath, () => reject(new Error('Download timeout')));
+    });
   });
 }
```
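For context, a hypothetical call site for this helper (the import path and destination path are illustrative):

```ts
import path from "node:path";
import { downloadFile } from "./helpers"; // import path assumed

// Fetch a LlamaCloud file into the local output directory so subsequent
// requests can be served from disk.
await downloadFile(
  "https://example.com/report.pdf",
  path.join("output", "llamacloud", "report.pdf"),
);
```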
packages/server/src/server.ts (1)
117-120: Consider URL construction safety and validation. The file server URL construction looks correct, but consider adding validation to ensure the URL is well-formed.
Apply this diff to add URL validation:
```diff
   // construct file server url for LlamaDeploy
   // eg. for Non-LlamaCloud: localhost:3000/deployments/chat/ui/api/files/data
   // eg. for LlamaCloud: localhost:3000/deployments/chat/ui/api/files/output/llamacloud
+  if (this.serverUrl && !this.serverUrl.startsWith('http')) {
+    console.warn('serverUrl should include protocol (http/https)');
+  }
   const fileServerUrl = `${this.serverUrl}${basePath}/api/files/${this.fileServer}`;
```
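Beyond a warning, the WHATWG URL constructor can validate eagerly. A sketch (not part of the PR) under the assumption that serverUrl is non-empty:

```ts
// Hypothetical alternative: let the URL constructor reject malformed bases.
// new URL(...) throws a TypeError when serverUrl has no usable protocol,
// so misconfiguration fails fast instead of producing a broken link.
function buildFileServerUrl(
  serverUrl: string, // must include protocol, e.g. "http://localhost:3000"
  basePath: string, // e.g. "/deployments/chat/ui"
  fileServer: string, // e.g. "data" or "output/llamacloud"
): string {
  return new URL(`${basePath}/api/files/${fileServer}`, serverUrl).toString();
}

// buildFileServerUrl("http://localhost:3000", "/deployments/chat/ui", "data")
// => "http://localhost:3000/deployments/chat/ui/api/files/data"
```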
📜 Review details
Configuration used: .coderabbit.yaml
Review profile: CHILL
Plan: Pro
⛔ Files ignored due to path filters (1)
pnpm-lock.yaml is excluded by !**/pnpm-lock.yaml
📒 Files selected for processing (8)
- .changeset/wise-ways-knock.md (1 hunks)
- packages/server/next/app/api/files/[...slug]/route.ts (1 hunks)
- packages/server/next/app/api/files/helpers.ts (2 hunks)
- packages/server/next/app/components/ui/chat/chat-section.tsx (1 hunks)
- packages/server/next/app/components/ui/chat/chat-starter.tsx (1 hunks)
- packages/server/package.json (1 hunks)
- packages/server/src/server.ts (4 hunks)
- packages/server/src/types.ts (1 hunks)
🧰 Additional context used
📓 Path-based instructions (6)
`**/*.{ts,tsx}`: TypeScript code should be linted using ESLint and formatted with Prettier, as enforced by 'pnpm lint' and 'pnpm format' at the root level.
📄 Source: CodeRabbit Inference Engine (CLAUDE.md). Applied to: packages/server/src/types.ts, packages/server/next/app/components/ui/chat/chat-section.tsx, packages/server/next/app/api/files/helpers.ts, packages/server/next/app/components/ui/chat/chat-starter.tsx, packages/server/next/app/api/files/[...slug]/route.ts, packages/server/src/server.ts

`packages/server/**/*.{ts,tsx}`: TypeScript server code should be located in 'packages/server/' and use Next.js as the framework, with the core server logic implemented in a 'LlamaIndexServer' class.
📄 Source: CodeRabbit Inference Engine (CLAUDE.md). Applied to: the same six files as above.

`packages/server/src/types.ts`: Type definitions for WorkflowFactory, UIConfig, and LlamaIndexServerOptions should be implemented in src/types.ts.
📄 Source: CodeRabbit Inference Engine (packages/server/CLAUDE.md). Applied to: packages/server/src/types.ts

`packages/server/**/*.{ts,tsx}`: TypeScript should be used throughout the codebase for type safety.
📄 Source: CodeRabbit Inference Engine (packages/server/CLAUDE.md). Applied to: the same six files as above.

`packages/server/next/**/*.{js,jsx,ts,tsx}`: UI components for the chat interface, including message history, streaming responses, canvas panel, and custom layouts, should be implemented in the next/ directory using shadcn/ui components and Tailwind CSS. Tailwind CSS should be used for styling UI components.
📄 Source: CodeRabbit Inference Engine (packages/server/CLAUDE.md). Applied to: packages/server/next/app/components/ui/chat/chat-section.tsx, packages/server/next/app/api/files/helpers.ts, packages/server/next/app/components/ui/chat/chat-starter.tsx, packages/server/next/app/api/files/[...slug]/route.ts

`packages/server/src/server.ts`: The LlamaIndexServer class should be implemented in src/server.ts and serve as the main server implementation that wraps Next.js.
📄 Source: CodeRabbit Inference Engine (packages/server/CLAUDE.md). Applied to: packages/server/src/server.ts
🧠 Learnings (9)
All learnings were learnt from CR on PR run-llama/create-llama#0, sourced from the repository's CLAUDE.md files via the CodeRabbit Inference Engine. The unique learnings applied across the reviewed files:

Python server (python/llama-index-server):
- Python server code should be located in 'python/llama-index-server/' and use FastAPI, with the core server logic implemented in a 'LlamaIndexServer' class.
- The main LlamaIndexServer class should be implemented in llama_index/server/server.py and extend FastAPI.
- The LlamaIndexServer should be configured using the workflow_factory parameter, with environment and UI configuration options as shown in the provided example.
- Workflow factory functions should accept a ChatRequest and return a Workflow instance, following the documented contract.
- Package exports, including LlamaIndexServer, UIConfig, and UIEvent, should be defined in llama_index/server/__init__.py.
- Business logic for file handling, LlamaCloud integration, and UI generation should be implemented within the services/ directory.
- The /api/chat endpoint must support streaming responses compatible with Vercel, background tasks for file downloads, and LlamaCloud integration if enabled.
- Static files from data/ and output/ directories must be served at /api/files/data/* and /api/files/output/* endpoints.
- Bundled UI assets should be included in llama_index/server/resources for package distribution.
- Static assets and bundled UI files should be placed in the resources/ directory.
- Downloaded UI static files should be placed in the .ui/ directory.
- Custom UI components should be placed in the components/ directory, and custom layout sections in the layout/ directory.
- Package configuration, dependencies, and build settings must be specified in pyproject.toml.
- Environment variables should be managed using .env files for API keys and configuration.
- Python unit tests should use pytest and provide comprehensive API and service coverage.
- Tests should be implemented using pytest, pytest-asyncio, and pytest-mock.

TypeScript server (packages/server):
- TypeScript server code should be located in 'packages/server/' and use Next.js as the framework, with the core server logic implemented in a 'LlamaIndexServer' class.
- TypeScript should be used throughout the codebase for type safety.
- The LlamaIndexServer class should be implemented in src/server.ts and serve as the main server implementation that wraps Next.js.
- Type definitions for WorkflowFactory, UIConfig, and LlamaIndexServerOptions should be implemented in src/types.ts.
- API route handlers for chat, components, layout, and file serving should be implemented in next/api/{chat,components,layout,files/**}/route.ts.
- The handleChat function should process POST requests to /api/chat and be implemented in src/handlers/chat.ts.
- The runWorkflow function should execute workflows with proper event handling and be implemented in src/utils/workflow.ts.
- Event system logic, including source, agent, and artifact events, as well as helper functions for converting LlamaIndex data to UI events, should be implemented in src/events.ts.
- The generateEventComponent function, responsible for using LLMs to auto-generate React components, should be implemented in src/utils/gen-ui.ts.
- UI components for the chat interface, including message history, streaming responses, canvas panel, and custom layouts, should be implemented in the next/ directory using shadcn/ui components and Tailwind CSS.
- Tailwind CSS should be used for styling UI components.
- Data and output folders for file integration should be mounted and served as static assets via Next.js.
- Static assets, including client-side config, should be placed in the public/ directory (e.g., public/config.js).

Examples (packages/server/examples):
- Use the standard server setup pattern: instantiate `LlamaIndexServer` with `workflow`, `uiConfig`, and `port`, then call `.start()`.
- Configure UI with options such as `starterQuestions`, `layoutDir`, `devMode`, and `suggestNextQuestions` in the server setup.
- Use the workflow factory pattern for workflow creation, i.e., define `workflowFactory` as a function returning an agent instance, optionally async.
- Use the `agent()` function from `@llamaindex/workflow` with tool arrays for agent creation.
- Demonstrate proper async/await patterns and error handling for LLM operations.
- Organize workflow files separately in development mode, e.g., `src/app/workflow.ts`.
- Place custom React layout components in the `layout/` directory, e.g., `layout/header.tsx`.
- Exclude runtime React components in `custom-layout/layout` from TypeScript compilation output.

CLI (packages/create-llama):
- The main CLI entry point should be implemented in `index.ts` using Commander.js for argument parsing.
- The CLI should accept command-line options for framework selection, template type, model providers, vector databases, data sources, tools, and observability options.
- Interactive prompts for user configuration should be implemented in the `questions/` directory.
- Core application creation logic and orchestration should be implemented in `create-app.ts`.
- The package configuration and binary entry point should be defined in `package.json`, with the binary pointing to `./dist/index.js`.
🧬 Code Graph Analysis (3)
packages/server/next/app/components/ui/chat/chat-section.tsx (1)
- packages/server/next/app/components/ui/lib/utils.ts: getConfig (lines 10-13)
packages/server/next/app/api/files/helpers.ts (1)
- packages/server/src/utils/file.ts: downloadFile (lines 5-32)
packages/server/next/app/components/ui/chat/chat-starter.tsx (1)
- packages/server/next/app/components/ui/lib/utils.ts: getConfig (lines 10-13)
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (14)
- GitHub Check: typescript (22, macos-latest, nextjs, llamacloud)
- GitHub Check: python (20, 3.11, macos-latest, fastapi, none)
- GitHub Check: typescript (22, ubuntu-22.04, nextjs, none)
- GitHub Check: typescript (22, windows-latest, nextjs, llamacloud)
- GitHub Check: typescript (22, ubuntu-22.04, nextjs, llamacloud)
- GitHub Check: lint
- GitHub Check: python (20, 3.11, macos-latest, fastapi, llamacloud)
- GitHub Check: typescript (22, windows-latest, nextjs, none)
- GitHub Check: typescript (22, macos-latest, nextjs, none)
- GitHub Check: python (20, 3.11, ubuntu-22.04, fastapi, llamacloud)
- GitHub Check: python (20, 3.11, windows-latest, fastapi, none)
- GitHub Check: python (20, 3.11, windows-latest, fastapi, llamacloud)
- GitHub Check: python (20, 3.11, ubuntu-22.04, fastapi, none)
- GitHub Check: Unit Tests (windows-latest, 3.9)
🔇 Additional comments (12)
packages/server/next/app/api/files/helpers.ts (1)
3-3: LGTM: HTTPS import added for file download functionality. The HTTPS module import is correctly added to support the new download functionality.
.changeset/wise-ways-knock.md (1)
1-6: LGTM: Changeset properly documents the file server feature addition. The changeset correctly identifies this as a patch release and provides a clear description of the new file server functionality for Python llamadeploy.
packages/server/next/app/components/ui/chat/chat-section.tsx (1)
41-41: FILE_SERVER_URL configuration verified in server setup. The FILE_SERVER_URL key is emitted in packages/server/src/server.ts when writing out the config, confirming it's available for getConfig("FILE_SERVER_URL"). LGTM; no further action needed.
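As an illustration of that wiring, here is a minimal sketch; the window.LLAMAINDEX global name and exact config shape are assumptions, not the PR's verbatim code:

```ts
// Server side writes public/config.js, e.g.:
//   window.LLAMAINDEX = { FILE_SERVER_URL: "http://localhost:3000/api/files/data" };

declare global {
  interface Window {
    LLAMAINDEX?: Record<string, unknown>; // assumed global name
  }
}

// Sketch of a getConfig-style accessor reading the emitted config.
export function getConfig(key: string): unknown {
  if (typeof window === "undefined") return undefined; // SSR guard
  return window.LLAMAINDEX?.[key];
}

// Client usage:
// const fileServerUrl = getConfig("FILE_SERVER_URL") as string | undefined;
```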
packages/server/src/types.ts (2)
28-28: LGTM: serverUrl property added to UIConfig type. The optional serverUrl property is correctly typed and maintains backward compatibility.
34-34: LGTM: fileServer property added to LlamaIndexServerOptions type. The optional fileServer property is correctly typed and follows the established pattern for configuration options.
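A rough sketch of what these additions imply for the two types (surrounding fields elided; the comments are inferences from the review, not the PR's code):

```ts
// Sketch only: other members of both types are omitted.
export interface UIConfig {
  // ...existing UI options (starterQuestions, layoutDir, devMode, ...)
  serverUrl?: string; // base URL of the deployment; empty/undefined => current host
}

export interface LlamaIndexServerOptions {
  // ...existing options (workflow, uiConfig, port, ...)
  fileServer?: string; // directory prefix served under /api/files/
}
```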
packages/server/next/app/components/ui/chat/chat-starter.tsx (1)
9-15: LGTM! Improved defensive programming. The refactoring enhances robustness by explicitly validating the configuration value before use. The separation of concerns (fetching config vs. using config) and the explicit array validation make the code more maintainable and less prone to runtime errors.
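A condensed sketch of the pattern being praised; the component body, import path, and config key are hypothetical:

```tsx
import { getConfig } from "../lib/utils"; // import path assumed

export function ChatStarter() {
  // Fetch the config value first, then validate its shape before using it.
  const config = getConfig("STARTER_QUESTIONS"); // key name assumed
  const starterQuestions = Array.isArray(config) ? (config as string[]) : [];
  if (starterQuestions.length === 0) return null;

  return (
    <ul>
      {starterQuestions.map((question) => (
        <li key={question}>{question}</li>
      ))}
    </ul>
  );
}
```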
packages/server/next/app/api/files/[...slug]/route.ts (2)
2-2: LGTM! Appropriate imports for LlamaCloud integration. The imports are properly structured and support the new file server functionality. Also applies to: 5-5.
49-49: LGTM! Proper file existence check after download attempt. The file existence check is correctly placed after the LlamaCloud download logic to ensure newly downloaded files are detected.
packages/server/src/server.ts (4)
26-27: LGTM! Appropriate new properties for file server functionality. The new serverUrl and fileServer properties are well-positioned and support the enhanced file server capabilities.
39-43: LGTM! Solid initialization logic with appropriate defaults. The initialization logic correctly handles (see the sketch after this list):
- serverUrl defaulting to empty string (using the current host)
- fileServer path selection based on LlamaCloud availability
- proper fallback to sensible defaults
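A minimal sketch of those defaults; the class shape and helper name are assumed, not the PR's actual code:

```ts
// Hypothetical excerpt mirroring the initialization described above.
declare function isLlamaCloudEnabled(): boolean;

class LlamaIndexServerSketch {
  serverUrl: string;
  fileServer: string;

  constructor(options: { serverUrl?: string; fileServer?: string }) {
    // Empty string means "use the current host" on the client side.
    this.serverUrl = options.serverUrl ?? "";
    // Default file root depends on whether LlamaCloud is enabled.
    this.fileServer =
      options.fileServer ??
      (isLlamaCloudEnabled() ? "output/llamacloud" : "data");
  }
}
```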
51-58: LGTM! Improved validation for llamaDeploy configuration. The expanded validation correctly prevents incompatible features from being used together with LlamaDeploy. The error message is clear and informative.
134-135: LGTM! Proper exposure of file server URL to client. The FILE_SERVER_URL configuration is correctly added to the client-side configuration object, enabling frontend components to access the file server.
Summary by CodeRabbit
- New Features
- Bug Fixes
- Refactor
- Chores