refactor: llamacloud configs #707
Conversation
🦋 Changeset detected

Latest commit: e2163de

The changes in this PR will be included in the next version bump. This PR includes changesets to release 4 packages.
Walkthrough

This update refactors and extends LlamaCloud integration across server, templates, and documentation. It introduces a structured `llamaCloud` configuration object in place of the previous boolean flag.
Sequence Diagram(s)

```mermaid
sequenceDiagram
    participant User
    participant Server
    participant LlamaIndexServer
    participant LlamaCloud

    User->>Server: Sends request (e.g., chat, file API)
    Server->>LlamaIndexServer: Initializes with llamaCloud config
    alt File API with llamaCloud enabled
        Server->>Server: Injects useLlamaCloud=true query param
        Server->>LlamaCloud: Retrieves file using outputDir config
        LlamaCloud-->>Server: Returns file data
    end
    alt Chat request
        Server->>LlamaIndexServer: handleChat(..., llamaCloudOutputDir)
        LlamaIndexServer->>LlamaCloud: Uses outputDir for source events
    end
    Server-->>User: Responds with data
```
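For orientation, a minimal server setup matching this flow might look like the sketch below (assumptions: the `@llamaindex/server` package name and the `./src/app/workflow` import path; option values are illustrative):

```ts
import { LlamaIndexServer } from "@llamaindex/server";
import { workflowFactory } from "./src/app/workflow"; // assumed location

new LlamaIndexServer({
  workflow: workflowFactory,
  uiConfig: { componentsDir: "components" },
  // Structured LlamaCloud config introduced by this PR:
  llamaCloud: {
    outputDir: "output/llamacloud", // where retrieved files are written
  },
}).start();
```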
Actionable comments posted: 2
♻️ Duplicate comments (1)
packages/server/src/utils/events.ts (1)
44-52: Parameter-based configuration is a valid architectural choice.

While environment variables were suggested in the past review, the parameter-based approach has advantages:
- Explicit dependencies make the code more testable
- Avoids global state and improves modularity
- Maintains backward compatibility with the default value
The implementation correctly propagates the configuration through the call stack.
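As a rough illustration of the endorsed pattern (a sketch only; the actual helper names and signatures in `src/utils/events.ts` may differ):

```ts
const DEFAULT_LLAMACLOUD_OUTPUT_DIR = "output/llamacloud";

// Hypothetical helper: builds the download URL for a source event.
// The output directory is an explicit parameter with a default, rather
// than a global or an environment variable, so every call site states
// its dependency and the function stays trivially unit-testable.
function toSourceEventUrl(
  fileName: string,
  llamaCloudOutputDir: string = DEFAULT_LLAMACLOUD_OUTPUT_DIR,
): string {
  return `/api/files/${llamaCloudOutputDir}/${fileName}`;
}
```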
🧹 Nitpick comments (6)
.changeset/gorgeous-squids-run.md (1)
2-6: Add a clearer, human-readable summary line.
`refactor: llamacloud configs` is a bit terse. The changeset file is used for release notes, so expanding this sentence (e.g. “Refactor LlamaCloud configuration from boolean flag to structured object”) will make the changelog far more helpful.
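For example, the changeset body could read as follows (a sketch; the package names are placeholders for the four packages this PR actually bumps):

```md
---
"@llamaindex/server": patch
"create-llama": patch
---

Refactor LlamaCloud configuration from a boolean flag to a structured `llamaCloud` object
```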
packages/create-llama/templates/components/use-cases/python/document_generator/README-template.md (1)

92-99: Document all public `llamaCloud` options for consistency.

The server docs list both `outputDir` and `indexSelector` on the `llamaCloud` object, but this README only mentions `outputDir`. Omitting `indexSelector` creates a mismatch that can confuse users.

```diff
-The following are the available options:
-- `outputDir`: The directory for LlamaCloud output
+The following are the available options:
+- `outputDir`: Directory where LlamaCloud stores downloaded files
+- `indexSelector` (boolean, optional): Show the LlamaCloud index-selector dropdown in the chat UI
```
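In code, the two documented options would then sit side by side, e.g. (a sketch; values are illustrative):

```ts
// Hypothetical config object combining both documented options:
const llamaCloudConfig = {
  outputDir: "output/llamacloud", // where LlamaCloud stores downloaded files
  indexSelector: true, // render the index-selector dropdown in the chat UI
};
```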
packages/create-llama/templates/types/llamaindexserver/nextjs/src/llamacloud/index.ts (1)

8-17: Handle the promise returned by `.start()` to avoid unhandled rejections.

`LlamaIndexServer.start()` is async and returns a promise. Creating the server inline without awaiting / catching means any startup failure will be swallowed by Node’s unhandled-rejection handler (or crash the process in future Node versions with `--unhandled-rejections=strict`).

```diff
-new LlamaIndexServer({
+const server = new LlamaIndexServer({
   workflow: workflowFactory,
   uiConfig: {
     componentsDir: "components",
     devMode: true,
   },
   llamaCloud: {
     outputDir: "output/llamacloud",
   },
-}).start();
+});
+
+server.start().catch((err) => {
+  console.error("Failed to start LlamaIndexServer:", err);
+  process.exit(1);
+});
```
packages/server/CLAUDE.md (1)

133-135: Minor wording tweak for clarity.

Consider rephrasing to make the object hierarchy explicit:

```diff
-- `llamaCloud`: An object to configure the LlamaCloud integration containing the following properties:
+- `llamaCloud`: Object configuring LlamaCloud integration. Properties:
```

Purely editorial; no functional impact.
packages/create-llama/templates/components/use-cases/python/agentic_rag/README-template.md (1)
98-105: Mirror the full `llamaCloud` option set.

Like the previous README, this section omits the `indexSelector` flag that exists in the main server docs.

```diff
-The following are the available options:
-- `outputDir`: The directory for LlamaCloud output
+The following are the available options:
+- `outputDir`: Directory where LlamaCloud stores downloaded files
+- `indexSelector` (boolean, optional): Show the LlamaCloud index-selector dropdown in the chat UI
```
packages/server/src/server.ts (1)

198-205: Consider a more explicit approach for LlamaCloud file handling.

The current implementation adds a query parameter to signal LlamaCloud usage. Consider passing the configuration more explicitly:

```diff
 if (
   pathname?.includes("/api/files") &&
   req.method === "GET" &&
   this.llamaCloud
 ) {
-  query.useLlamaCloud = "true";
+  query.llamaCloudOutputDir = this.llamaCloud.outputDir;
 }
```

This would make the configuration more explicit and avoid boolean string conversions.
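On the consuming side, the file route could then read the directory straight from the query string; a hedged sketch (the real handler in `next/app/api/files/[...slug]/route.ts` will differ in detail):

```ts
import { NextRequest, NextResponse } from "next/server";

export async function GET(req: NextRequest) {
  // Hypothetical param name taken from the suggestion above.
  const outputDir = req.nextUrl.searchParams.get("llamaCloudOutputDir");
  if (!outputDir) {
    return NextResponse.json({ error: "Not a LlamaCloud file" }, { status: 404 });
  }
  // ...resolve and stream the requested file from `outputDir` here...
  return NextResponse.json({ outputDir });
}
```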
📜 Review details
Configuration used: .coderabbit.yaml
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (17)
- .changeset/gorgeous-squids-run.md (1 hunks)
- packages/create-llama/helpers/python.ts (1 hunks)
- packages/create-llama/helpers/typescript.ts (1 hunks)
- packages/create-llama/templates/components/ts-proxy/llamacloud/index.ts (1 hunks)
- packages/create-llama/templates/components/use-cases/python/agentic_rag/README-template.md (1 hunks)
- packages/create-llama/templates/components/use-cases/python/code_generator/README-template.md (1 hunks)
- packages/create-llama/templates/components/use-cases/python/deep_research/README-template.md (1 hunks)
- packages/create-llama/templates/components/use-cases/python/document_generator/README-template.md (1 hunks)
- packages/create-llama/templates/types/llamaindexserver/nextjs/src/llamacloud/index.ts (1 hunks)
- packages/server/CLAUDE.md (1 hunks)
- packages/server/README.md (1 hunks)
- packages/server/next/app/api/files/[...slug]/route.ts (1 hunks)
- packages/server/src/handlers/chat.ts (2 hunks)
- packages/server/src/server.ts (7 hunks)
- packages/server/src/types.ts (1 hunks)
- packages/server/src/utils/events.ts (2 hunks)
- packages/server/src/utils/workflow.ts (2 hunks)
🧰 Additional context used

📓 Path-based instructions (10), sourced from the CodeRabbit Inference Engine:

- `packages/create-llama/templates/**/*`: CLAUDE.md, packages/create-llama/CLAUDE.md
- `**/*.{ts,tsx}`: CLAUDE.md
- `packages/create-llama/**/index.ts`: packages/create-llama/CLAUDE.md
- `packages/server/**/*.{ts,tsx}`: CLAUDE.md, packages/server/CLAUDE.md
- `packages/server/next/**/*.{js,jsx,ts,tsx}`: packages/server/CLAUDE.md
- `packages/server/src/utils/workflow.ts`: packages/server/CLAUDE.md
- `packages/server/src/handlers/chat.ts`: packages/server/CLAUDE.md
- `packages/create-llama/helpers/**/*`: packages/create-llama/CLAUDE.md
- `packages/server/src/types.ts`: packages/server/CLAUDE.md
- `packages/server/src/server.ts`: packages/server/CLAUDE.md
🧠 Learnings

📓 Common learnings (all learnt from CR, PR run-llama/create-llama#0):

- Applies to python/llama-index-server/services/**/*: Business logic for file handling, LlamaCloud integration, and UI generation should be implemented within the services/ directory.
- Applies to python/llama-index-server/**/.env: Environment variables should be managed using .env files for API keys and configuration.
- Applies to python/llama-index-server/**/pyproject.toml: Package configuration, dependencies, and build settings must be specified in pyproject.toml.
- Applies to python/llama-index-server/**/*.py: Python server code should be located in 'python/llama-index-server/' and use FastAPI, with the core server logic implemented in a 'LlamaIndexServer' class.
- Applies to packages/server/src/server.ts: The LlamaIndexServer class should be implemented in src/server.ts and serve as the main server implementation that wraps Next.js.
- Applies to packages/server/examples/{simple-workflow/calculator.ts,agentic-rag/index.ts,custom-layout/index.ts,devmode/index.ts,src/app/workflow.ts}: Use the standard server setup pattern: instantiate `LlamaIndexServer` with `workflow`, `uiConfig`, and `port`, then call `.start()`.
- Applies to python/llama-index-server/llama_index/server/__init__.py: Package exports, including LlamaIndexServer, UIConfig, and UIEvent, should be defined in llama_index/server/__init__.py.
- Applies to packages/server/**/*.{ts,tsx}: TypeScript server code should be located in 'packages/server/' and use Next.js as the framework, with the core server logic implemented in a 'LlamaIndexServer' class.
- Applies to packages/create-llama/templates/**/*: Templates for the CLI should be organized under 'packages/create-llama/templates/', with 'types/' for base project structures and 'components/' for reusable framework components.
- Applies to python/llama-index-server/api/routers/chat.py: The /api/chat endpoint must support streaming responses compatible with Vercel, background tasks for file downloads, and LlamaCloud integration if enabled.
Learning: Applies to packages/create-llama/templates/**/* : Templates for the CLI should be organized under 'packages/create-llama/templates/', with 'types/' for base project structures and 'components/' for reusable framework components.
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/create-llama/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:39.549Z
Learning: Applies to packages/create-llama/**/package.json : Testing scripts for end-to-end, Python-specific, and TypeScript-specific templates should be defined in `package.json`.
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/server/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:57.724Z
Learning: Applies to packages/server/src/events.ts : Event system logic, including source, agent, and artifact events, as well as helper functions for converting LlamaIndex data to UI events, should be implemented in src/events.ts.
Learnt from: CR
PR: run-llama/create-llama#0
File: CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:26.711Z
Learning: Applies to packages/create-llama/e2e/**/* : Playwright end-to-end tests should be placed in 'packages/create-llama/e2e/' and validate both Python and TypeScript generated projects.
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/create-llama/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:39.549Z
Learning: Applies to packages/create-llama/**/index.ts : The CLI should accept command-line options for framework selection, template type, model providers, vector databases, data sources, tools, and observability options.
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/server/examples/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:19:29.893Z
Learning: Applies to packages/server/examples/**/*.{ts,tsx} : Demonstrate proper async/await patterns and error handling for LLM operations.
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/create-llama/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:39.549Z
Learning: Applies to packages/create-llama/templates/**/* : Project templates for different frameworks and use cases should be stored in the `templates/` directory.
Learnt from: CR
PR: run-llama/create-llama#0
File: CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:26.711Z
Learning: Changes to templates require rebuilding the CLI and should be validated with end-to-end tests.
packages/create-llama/templates/components/use-cases/python/code_generator/README-template.md (16)
Learnt from: CR
PR: run-llama/create-llama#0
File: python/llama-index-server/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:20:25.875Z
Learning: Applies to python/llama-index-server/gen_ui/**/* : AI-powered UI component generation system should be implemented within the gen_ui/ directory.
Learnt from: CR
PR: run-llama/create-llama#0
File: python/llama-index-server/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:20:25.875Z
Learning: Applies to python/llama-index-server/gen_ui/main.py : AI-powered component generation using LLM workflows should be implemented in gen_ui/main.py, including GenUIWorkflow, planning, aggregation, code generation, and validation.
Learnt from: CR
PR: run-llama/create-llama#0
File: python/llama-index-server/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:20:25.875Z
Learning: Applies to python/llama-index-server/{examples,docs}/**/*.{ipynb,md} : Jupyter notebooks and markdown files should be used for examples and documentation.
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/server/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:57.724Z
Learning: Applies to packages/server/src/utils/gen-ui.ts : The generateEventComponent function, responsible for using LLMs to auto-generate React components, should be implemented in src/utils/gen-ui.ts.
Learnt from: CR
PR: run-llama/create-llama#0
File: python/llama-index-server/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:20:25.875Z
Learning: Applies to python/llama-index-server/tools/**/* : Document generation, interpreter tools, and index querying utilities should be implemented within the tools/ directory.
Learnt from: CR
PR: run-llama/create-llama#0
File: python/llama-index-server/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:20:25.875Z
Learning: Applies to python/llama-index-server/**/pyproject.toml : Package configuration, dependencies, and build settings must be specified in pyproject.toml.
Learnt from: CR
PR: run-llama/create-llama#0
File: python/llama-index-server/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:20:25.875Z
Learning: Applies to python/llama-index-server/{components,layout}/**/* : Custom UI components should be placed in the components/ directory, and custom layout sections in the layout/ directory.
Learnt from: CR
PR: run-llama/create-llama#0
File: python/llama-index-server/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:20:25.875Z
Learning: Applies to python/llama-index-server/llama_index/server/__init__.py : Package exports, including LlamaIndexServer, UIConfig, and UIEvent, should be defined in llama_index/server/__init__.py.
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/create-llama/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:39.549Z
Learning: Applies to packages/create-llama/**/index.ts : The CLI should accept command-line options for framework selection, template type, model providers, vector databases, data sources, tools, and observability options.
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/create-llama/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:39.549Z
Learning: Applies to packages/create-llama/**/index.ts : The main CLI entry point should be implemented in `index.ts` using Commander.js for argument parsing.
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/server/examples/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:19:29.893Z
Learning: Applies to packages/server/examples/{simple-workflow/calculator.ts,agentic-rag/index.ts,custom-layout/index.ts,devmode/index.ts,src/app/workflow.ts} : Configure UI with options such as `starterQuestions`, `layoutDir`, `devMode`, and `suggestNextQuestions` in the server setup.
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/create-llama/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:39.549Z
Learning: Applies to packages/create-llama/questions/**/* : Interactive prompts for user configuration should be implemented in the `questions/` directory.
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/server/examples/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:19:29.893Z
Learning: Applies to packages/server/examples/{simple-workflow/calculator.ts,agentic-rag/index.ts,custom-layout/index.ts,devmode/index.ts,src/app/workflow.ts} : Use the standard server setup pattern: instantiate `LlamaIndexServer` with `workflow`, `uiConfig`, and `port`, then call `.start()`.
Learnt from: CR
PR: run-llama/create-llama#0
File: python/llama-index-server/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:20:25.875Z
Learning: Applies to python/llama-index-server/components/**/* : Structure custom UI components in dedicated directories.
Learnt from: CR
PR: run-llama/create-llama#0
File: python/llama-index-server/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:20:25.875Z
Learning: The LlamaIndexServer should be configured using the workflow_factory parameter, with environment and UI configuration options as shown in the provided example.
Learnt from: CR
PR: run-llama/create-llama#0
File: CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:26.711Z
Learning: Applies to packages/create-llama/templates/**/* : Templates for the CLI should be organized under 'packages/create-llama/templates/', with 'types/' for base project structures and 'components/' for reusable framework components.
packages/server/src/types.ts (13)
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/server/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:57.724Z
Learning: Applies to packages/server/src/types.ts : Type definitions for WorkflowFactory, UIConfig, and LlamaIndexServerOptions should be implemented in src/types.ts.
Learnt from: CR
PR: run-llama/create-llama#0
File: python/llama-index-server/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:20:25.875Z
Learning: Applies to python/llama-index-server/llama_index/server/__init__.py : Package exports, including LlamaIndexServer, UIConfig, and UIEvent, should be defined in llama_index/server/__init__.py.
Learnt from: CR
PR: run-llama/create-llama#0
File: CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:26.711Z
Learning: Applies to packages/server/**/*.{ts,tsx} : TypeScript server code should be located in 'packages/server/' and use Next.js as the framework, with the core server logic implemented in a 'LlamaIndexServer' class.
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/server/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:57.724Z
Learning: Applies to packages/server/src/server.ts : The LlamaIndexServer class should be implemented in src/server.ts and serve as the main server implementation that wraps Next.js.
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/server/examples/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:19:29.893Z
Learning: Applies to packages/server/examples/{simple-workflow/calculator.ts,agentic-rag/index.ts,custom-layout/index.ts,devmode/index.ts,src/app/workflow.ts} : Use the standard server setup pattern: instantiate `LlamaIndexServer` with `workflow`, `uiConfig`, and `port`, then call `.start()`.
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/server/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:57.724Z
Learning: Applies to packages/server/src/utils/gen-ui.ts : The generateEventComponent function, responsible for using LLMs to auto-generate React components, should be implemented in src/utils/gen-ui.ts.
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/server/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:57.724Z
Learning: Applies to packages/server/src/events.ts : Event system logic, including source, agent, and artifact events, as well as helper functions for converting LlamaIndex data to UI events, should be implemented in src/events.ts.
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/create-llama/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:39.549Z
Learning: Applies to packages/create-llama/**/index.ts : The CLI should accept command-line options for framework selection, template type, model providers, vector databases, data sources, tools, and observability options.
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/server/examples/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:19:29.893Z
Learning: Applies to packages/server/examples/{simple-workflow/calculator.ts,agentic-rag/index.ts,custom-layout/index.ts,devmode/index.ts,src/app/workflow.ts} : Configure UI with options such as `starterQuestions`, `layoutDir`, `devMode`, and `suggestNextQuestions` in the server setup.
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/create-llama/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:39.549Z
Learning: Applies to packages/create-llama/**/index.ts : The main CLI entry point should be implemented in `index.ts` using Commander.js for argument parsing.
Learnt from: CR
PR: run-llama/create-llama#0
File: python/llama-index-server/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:20:25.875Z
Learning: The LlamaIndexServer should be configured using the workflow_factory parameter, with environment and UI configuration options as shown in the provided example.
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/create-llama/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:39.549Z
Learning: Applies to packages/create-llama/questions/**/* : Interactive prompts for user configuration should be implemented in the `questions/` directory.
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/create-llama/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:39.549Z
Learning: Applies to packages/create-llama/**/package.json : The package configuration and binary entry point should be defined in `package.json`, with the binary pointing to `./dist/index.js`.
packages/create-llama/helpers/python.ts (17)
Learnt from: CR
PR: run-llama/create-llama#0
File: CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:26.711Z
Learning: Applies to packages/server/**/*.{ts,tsx} : TypeScript server code should be located in 'packages/server/' and use Next.js as the framework, with the core server logic implemented in a 'LlamaIndexServer' class.
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/server/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:57.724Z
Learning: Applies to packages/server/src/types.ts : Type definitions for WorkflowFactory, UIConfig, and LlamaIndexServerOptions should be implemented in src/types.ts.
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/create-llama/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:39.549Z
Learning: Applies to packages/create-llama/**/index.ts : The main CLI entry point should be implemented in `index.ts` using Commander.js for argument parsing.
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/create-llama/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:39.549Z
Learning: Applies to packages/create-llama/**/create-app.ts : Core application creation logic and orchestration should be implemented in `create-app.ts`.
Learnt from: CR
PR: run-llama/create-llama#0
File: CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:26.711Z
Learning: Applies to packages/create-llama/e2e/**/* : Playwright end-to-end tests should be placed in 'packages/create-llama/e2e/' and validate both Python and TypeScript generated projects.
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/create-llama/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:39.549Z
Learning: Applies to packages/create-llama/**/package.json : Testing scripts for end-to-end, Python-specific, and TypeScript-specific templates should be defined in `package.json`.
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/server/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:57.724Z
Learning: Applies to packages/server/src/server.ts : The LlamaIndexServer class should be implemented in src/server.ts and serve as the main server implementation that wraps Next.js.
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/server/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:57.724Z
Learning: Applies to packages/server/src/utils/gen-ui.ts : The generateEventComponent function, responsible for using LLMs to auto-generate React components, should be implemented in src/utils/gen-ui.ts.
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/create-llama/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:39.549Z
Learning: Applies to packages/create-llama/**/package.json : The package configuration and binary entry point should be defined in `package.json`, with the binary pointing to `./dist/index.js`.
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/create-llama/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:39.549Z
Learning: Applies to packages/create-llama/**/package.json : The package should support a `pack-install` script in `package.json` for creating and installing the local package for testing.
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/server/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:57.724Z
Learning: Applies to packages/server/next/**/*.{js,jsx,ts,tsx} : UI components for the chat interface, including message history, streaming responses, canvas panel, and custom layouts, should be implemented in the next/ directory using shadcn/ui components and Tailwind CSS.
Learnt from: CR
PR: run-llama/create-llama#0
File: CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:26.711Z
Learning: Applies to packages/create-llama/templates/**/* : Templates for the CLI should be organized under 'packages/create-llama/templates/', with 'types/' for base project structures and 'components/' for reusable framework components.
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/server/examples/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:19:29.893Z
Learning: Applies to packages/server/examples/{simple-workflow/calculator.ts,agentic-rag/index.ts,custom-layout/index.ts,devmode/index.ts,src/app/workflow.ts} : Use the standard server setup pattern: instantiate `LlamaIndexServer` with `workflow`, `uiConfig`, and `port`, then call `.start()`.
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/server/examples/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:19:29.893Z
Learning: Applies to packages/server/examples/**/*.{ts,tsx} : Demonstrate proper async/await patterns and error handling for LLM operations.
Learnt from: CR
PR: run-llama/create-llama#0
File: python/llama-index-server/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:20:25.875Z
Learning: Applies to python/llama-index-server/.ui/**/* : Downloaded UI static files should be placed in the .ui/ directory.
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/server/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:57.724Z
Learning: Applies to packages/server/next/**/*.{js,jsx,ts,tsx} : Tailwind CSS should be used for styling UI components.
Learnt from: CR
PR: run-llama/create-llama#0
File: CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:26.711Z
Learning: CLI build artifacts and template caches should be cleaned using the 'npm run clean' script in 'packages/create-llama/'.
packages/server/src/utils/events.ts (12)
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/server/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:57.724Z
Learning: Applies to packages/server/src/events.ts : Event system logic, including source, agent, and artifact events, as well as helper functions for converting LlamaIndex data to UI events, should be implemented in src/events.ts.
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/server/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:57.724Z
Learning: Applies to packages/server/src/utils/gen-ui.ts : The generateEventComponent function, responsible for using LLMs to auto-generate React components, should be implemented in src/utils/gen-ui.ts.
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/server/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:57.724Z
Learning: Applies to packages/server/src/types.ts : Type definitions for WorkflowFactory, UIConfig, and LlamaIndexServerOptions should be implemented in src/types.ts.
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/server/examples/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:19:29.893Z
Learning: Applies to packages/server/examples/src/app/workflow*.ts : Organize workflow files separately in development mode, e.g., `src/app/workflow.ts`.
Learnt from: CR
PR: run-llama/create-llama#0
File: python/llama-index-server/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:20:25.875Z
Learning: Applies to python/llama-index-server/**/.env : Environment variables should be managed using .env files for API keys and configuration.
Learnt from: thucpn
PR: run-llama/create-llama#0
File: :0-0
Timestamp: 2024-07-26T21:06:39.705Z
Learning: For the AstraDB integration in `create-llama`, errors related to missing environment variables in `checkRequiredEnvVars` are intended to be thrown to the server API, not handled by exiting the process.
Learnt from: thucpn
PR: run-llama/create-llama#0
File: :0-0
Timestamp: 2024-10-16T13:04:24.943Z
Learning: For the AstraDB integration in `create-llama`, errors related to missing environment variables in `checkRequiredEnvVars` are intended to be thrown to the server API, not handled by exiting the process.
Learnt from: CR
PR: run-llama/create-llama#0
File: python/llama-index-server/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:20:25.875Z
Learning: The LlamaIndexServer should be configured using the workflow_factory parameter, with environment and UI configuration options as shown in the provided example.
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/server/examples/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:19:29.893Z
Learning: Applies to packages/server/examples/**/*.ts : Use the `agent()` function from `@llamaindex/workflow` with tool arrays for agent creation.
Learnt from: CR
PR: run-llama/create-llama#0
File: python/llama-index-server/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:20:25.875Z
Learning: Applies to python/llama-index-server/api/models.py : Structured event types for workflow communication, including UIEvent, ArtifactEvent, SourceNodesEvent, and AgentRunEvent, should be defined in api/models.py using Pydantic data models.
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/server/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:57.724Z
Learning: Applies to packages/server/src/utils/workflow.ts : The runWorkflow function should execute workflows with proper event handling and be implemented in src/utils/workflow.ts.
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/server/examples/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:19:29.893Z
Learning: Applies to packages/server/examples/{simple-workflow/calculator.ts,agentic-rag/index.ts,custom-layout/index.ts,devmode/index.ts,src/app/workflow.ts} : Use the workflow factory pattern for workflow creation, i.e., define `workflowFactory` as a function returning an agent instance, optionally async.
packages/server/src/server.ts (22)
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/server/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:57.724Z
Learning: Applies to packages/server/src/server.ts : The LlamaIndexServer class should be implemented in src/server.ts and serve as the main server implementation that wraps Next.js.
Learnt from: CR
PR: run-llama/create-llama#0
File: CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:26.711Z
Learning: Applies to packages/server/**/*.{ts,tsx} : TypeScript server code should be located in 'packages/server/' and use Next.js as the framework, with the core server logic implemented in a 'LlamaIndexServer' class.
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/server/examples/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:19:29.893Z
Learning: Applies to packages/server/examples/{simple-workflow/calculator.ts,agentic-rag/index.ts,custom-layout/index.ts,devmode/index.ts,src/app/workflow.ts} : Use the standard server setup pattern: instantiate `LlamaIndexServer` with `workflow`, `uiConfig`, and `port`, then call `.start()`.
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/server/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:57.724Z
Learning: Applies to packages/server/src/types.ts : Type definitions for WorkflowFactory, UIConfig, and LlamaIndexServerOptions should be implemented in src/types.ts.
Learnt from: CR
PR: run-llama/create-llama#0
File: python/llama-index-server/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:20:25.875Z
Learning: Applies to python/llama-index-server/llama_index/server/__init__.py : Package exports, including LlamaIndexServer, UIConfig, and UIEvent, should be defined in llama_index/server/__init__.py.
Learnt from: CR
PR: run-llama/create-llama#0
File: CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:26.711Z
Learning: Applies to python/llama-index-server/**/*.py : Python server code should be located in 'python/llama-index-server/' and use FastAPI, with the core server logic implemented in a 'LlamaIndexServer' class.
Learnt from: CR
PR: run-llama/create-llama#0
File: python/llama-index-server/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:20:25.875Z
Learning: Applies to python/llama-index-server/llama_index/server/server.py : The main LlamaIndexServer class should be implemented in llama_index/server/server.py and extend FastAPI.
Learnt from: CR
PR: run-llama/create-llama#0
File: python/llama-index-server/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:20:25.875Z
Learning: Applies to python/llama-index-server/**/.env : Environment variables should be managed using .env files for API keys and configuration.
Learnt from: CR
PR: run-llama/create-llama#0
File: python/llama-index-server/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:20:25.875Z
Learning: Applies to python/llama-index-server/services/**/* : Business logic for file handling, LlamaCloud integration, and UI generation should be implemented within the services/ directory.
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/create-llama/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:39.549Z
Learning: Applies to packages/create-llama/**/index.ts : The main CLI entry point should be implemented in `index.ts` using Commander.js for argument parsing.
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/server/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:57.724Z
Learning: Applies to packages/server/next/api/{chat,components,layout,files/**}/route.ts : API route handlers for chat, components, layout, and file serving should be implemented in next/api/{chat,components,layout,files/**}/route.ts.
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/server/examples/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:19:29.893Z
Learning: Applies to packages/server/examples/**/*.{ts,tsx} : Demonstrate proper async/await patterns and error handling for LLM operations.
Learnt from: CR
PR: run-llama/create-llama#0
File: python/llama-index-server/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:20:25.875Z
Learning: Applies to python/llama-index-server/api/routers/chat.py : The /api/chat endpoint must support streaming responses compatible with Vercel, background tasks for file downloads, and LlamaCloud integration if enabled.
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/server/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:57.724Z
Learning: Applies to packages/server/src/handlers/chat.ts : The handleChat function should process POST requests to /api/chat and be implemented in src/handlers/chat.ts.
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/server/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:57.724Z
Learning: Applies to packages/server/next/**/*.{js,jsx,ts,tsx} : UI components for the chat interface, including message history, streaming responses, canvas panel, and custom layouts, should be implemented in the next/ directory using shadcn/ui components and Tailwind CSS.
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/server/examples/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:19:29.893Z
Learning: Applies to packages/server/examples/{simple-workflow/calculator.ts,agentic-rag/index.ts,custom-layout/index.ts,devmode/index.ts,src/app/workflow.ts} : Configure UI with options such as `starterQuestions`, `layoutDir`, `devMode`, and `suggestNextQuestions` in the server setup.
Learnt from: CR
PR: run-llama/create-llama#0
File: python/llama-index-server/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:20:25.875Z
Learning: The LlamaIndexServer should be configured using the workflow_factory parameter, with environment and UI configuration options as shown in the provided example.
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/create-llama/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:39.549Z
Learning: Applies to packages/create-llama/**/index.ts : The CLI should accept command-line options for framework selection, template type, model providers, vector databases, data sources, tools, and observability options.
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/create-llama/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:39.549Z
Learning: Applies to packages/create-llama/**/package.json : The package configuration and binary entry point should be defined in `package.json`, with the binary pointing to `./dist/index.js`.
Learnt from: thucpn
PR: run-llama/create-llama#0
File: :0-0
Timestamp: 2024-07-26T21:06:39.705Z
Learning: For the AstraDB integration in `create-llama`, errors related to missing environment variables in `checkRequiredEnvVars` are intended to be thrown to the server API, not handled by exiting the process.
Learnt from: thucpn
PR: run-llama/create-llama#0
File: :0-0
Timestamp: 2024-10-16T13:04:24.943Z
Learning: For the AstraDB integration in `create-llama`, errors related to missing environment variables in `checkRequiredEnvVars` are intended to be thrown to the server API, not handled by exiting the process.
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/create-llama/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:39.549Z
Learning: Applies to packages/create-llama/questions/**/* : Interactive prompts for user configuration should be implemented in the `questions/` directory.
🧬 Code Graph Analysis (7)

packages/create-llama/templates/types/llamaindexserver/nextjs/src/llamacloud/index.ts (1)
- packages/create-llama/templates/types/llamaindexserver/nextjs/src/app/settings.ts: `initSettings` (lines 4-11)

packages/server/src/utils/workflow.ts (1)
- packages/server/src/utils/events.ts: `toSourceEvent` (lines 65-76)

packages/server/src/handlers/chat.ts (1)
- packages/server/src/utils/workflow.ts: `processWorkflowStream` (lines 65-115)

packages/create-llama/helpers/typescript.ts (1)
- packages/create-llama/helpers/copy.ts: `copy` (lines 13-49)

packages/server/src/types.ts (2)
- python/llama-index-server/llama_index/server/services/llamacloud/index.py: `LlamaCloudConfig` (lines 22-69)
- python/llama-index-server/llama_index/server/server.py: `UIConfig` (lines 22-76)

packages/create-llama/helpers/python.ts (1)
- packages/create-llama/helpers/copy.ts: `copy` (lines 13-49)

packages/server/src/server.ts (2)
- packages/server/src/types.ts: `LlamaCloudConfig` (lines 30-33)
- python/llama-index-server/llama_index/server/services/llamacloud/index.py: `LlamaCloudConfig` (lines 22-69)
🔇 Additional comments (12)
packages/create-llama/templates/components/use-cases/python/deep_research/README-template.md (1)

98-105: Documentation update aligns well with the LlamaCloud refactor. The new dedicated "LlamaCloud Integration" section provides clear guidance on the restructured configuration, and separating it from the general UI options improves clarity and user experience.
packages/create-llama/templates/components/use-cases/python/code_generator/README-template.md (1)

91-98: Consistent documentation update across use-case templates. The LlamaCloud Integration section matches the pattern established in the other templates, keeping the user documentation consistent.
packages/create-llama/templates/components/ts-proxy/llamacloud/index.ts (1)

1-12: Server setup follows established patterns correctly. The implementation properly uses the new `llamaCloud` configuration object and follows the standard server instantiation pattern, and the configuration values are reasonable for a LlamaCloud-enabled deployment (see the sketch below).
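As a rough sketch of the pattern this entry describes — the template's exact values aren't reproduced in this review, so the workflow import, starter question, and output path below are assumptions:

```typescript
import { LlamaIndexServer } from "@llamaindex/server";
import { workflowFactory } from "./app/workflow"; // assumed location of the workflow factory

new LlamaIndexServer({
  workflow: workflowFactory,
  uiConfig: {
    starterQuestions: ["What's in the index?"], // assumed example question
  },
  // Structured LlamaCloud config introduced by this PR,
  // replacing the earlier boolean flag.
  llamaCloud: {
    outputDir: "output/llamacloud", // where downloaded LlamaCloud files land (assumed value)
    indexSelector: true, // show the index-selector dropdown in the chat UI
  },
  port: 3000,
}).start();
```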
packages/server/README.md (1)

69-72: Clear documentation of the new LlamaCloud configuration structure. The updated documentation properly explains the transition from a boolean flag to a structured configuration object, with clear explanations of both the `outputDir` and `indexSelector` properties.
packages/server/src/handlers/chat.ts (2)

24-24: LGTM: Clean parameter addition for LlamaCloud configuration. The optional `llamaCloudOutputDir` parameter follows TypeScript conventions and integrates well with the broader LlamaCloud refactoring.

57-60: LGTM: Proper parameter propagation to workflow processing. The `llamaCloudOutputDir` parameter is correctly passed to `processWorkflowStream`, maintaining the chain of configuration propagation from server options through to the workflow utilities (sketched below).
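A minimal sketch of that propagation chain, assuming simplified event and signature shapes (the real handler also deals with request parsing and streaming responses):

```typescript
// Hedged sketch of the propagation this comment describes; the actual
// signatures in src/handlers/chat.ts differ in detail.
type WorkflowEvent = { type: string; data?: unknown };

async function processWorkflowStream(
  stream: AsyncIterable<WorkflowEvent>,
  llamaCloudOutputDir?: string, // threaded through to source-event conversion
): Promise<WorkflowEvent[]> {
  const out: WorkflowEvent[] = [];
  for await (const event of stream) {
    out.push(event); // real code converts source-node events here, using llamaCloudOutputDir
  }
  return out;
}

export async function handleChat(
  stream: AsyncIterable<WorkflowEvent>,
  llamaCloudOutputDir?: string, // new optional parameter from the llamaCloud config
): Promise<WorkflowEvent[]> {
  // The handler does not interpret the value itself; it just forwards it.
  return processWorkflowStream(stream, llamaCloudOutputDir);
}
```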
137-147: LGTM: Well-structured LlamaCloud file promotion logic.The conditional logic correctly promotes the LlamaCloud-specific
index.tsfile to the main source directory and cleans up the temporary subdirectory. The async file operations are properly awaited.Consider that file operations could fail (e.g., if files don't exist or permissions issues), but the current approach should propagate errors appropriately through the async chain.
packages/server/src/utils/workflow.ts (2)

67-67: LGTM: Consistent parameter addition for LlamaCloud configuration. The optional `llamaCloudOutputDir` parameter maintains consistency with the broader refactoring and follows TypeScript conventions.

94-97: LGTM: Proper parameter forwarding to source event creation. The `llamaCloudOutputDir` parameter is correctly passed to `toSourceEvent`, ensuring the configuration flows through to where source nodes are processed. This aligns with the code-graph snippet showing that `toSourceEvent` expects this parameter (see the sketch below).
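Zooming into the workflow-utilities side, a hedged sketch of the hand-off to `toSourceEvent`; the node shape and the fallback directory are assumptions:

```typescript
// Illustrative shapes only — not the real llamaindex types.
interface SourceNode {
  id: string;
  filePath?: string;
}

function toSourceEvent(nodes: SourceNode[], llamaCloudOutputDir?: string) {
  // Resolve file URLs for LlamaCloud-retrieved sources against the configured
  // output directory instead of a hard-coded default.
  const dir = llamaCloudOutputDir ?? "output/llamacloud"; // assumed default
  return {
    type: "sources" as const,
    data: nodes.map((node) => ({
      id: node.id,
      url: `${dir}/${node.filePath ?? node.id}`,
    })),
  };
}

async function* processWorkflowStream(
  events: AsyncIterable<{ type: string; nodes?: SourceNode[] }>,
  llamaCloudOutputDir?: string,
) {
  for await (const event of events) {
    if (event.type === "sourceNodes" && event.nodes) {
      yield toSourceEvent(event.nodes, llamaCloudOutputDir); // config flows through here
    } else {
      yield event;
    }
  }
}
```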
442-453: LGTM: Refined copying logic for LlamaCloud integration.The updated approach is more precise than copying the entire
ts-proxydirectory. The selective copying ofpackage.jsonfollowed by conditionalindex.tscopying based on theuseLlamaParseflag aligns well with the LlamaCloud configuration refactoring and provides better control over which files are included.packages/server/src/types.ts (2)
30-33: LGTM: Well-designed structured configuration type.The
LlamaCloudConfigtype is well-structured with appropriate required/optional properties. The requiredoutputDirmakes sense for file operations, while the optionalindexSelectormaintains flexibility.
40-40: LGTM: Clean integration into server options.The optional
llamaCloudproperty maintains backward compatibility while enabling the new structured configuration approach.
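From the properties this review names, the type plausibly reads as follows; the comments and the surrounding options are assumptions:

```typescript
// Plausible shape of the structured config in src/types.ts, inferred from
// this review: outputDir is required, indexSelector is optional.
export type LlamaCloudConfig = {
  outputDir: string; // directory where files retrieved from LlamaCloud are written
  indexSelector?: boolean; // whether to show the index-selector dropdown in the chat UI
};

export type LlamaIndexServerOptions = {
  // ...existing options such as workflow, uiConfig, and port...
  llamaCloud?: LlamaCloudConfig; // optional: omitting it keeps the prior behavior
};
```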