
Conversation

@thucpn thucpn commented Jun 27, 2025

Summary by CodeRabbit

  • New Features

    • Introduced support for deployment-specific chat workflows, configurable by deployment and workflow settings.
    • Added a feature flag to enable the new chat workflow functionality.
  • Improvements

    • Centralized configuration for resource paths and deployment settings, making logo and script paths dynamic.
    • Enhanced error handling in chat UI for more consistent user feedback.
  • Dependency Updates

    • Updated chat UI package to the latest version for improved stability and features.
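
For orientation, a server set up with the new deployment option might look roughly like this. This is a minimal sketch: the deployment and workflow names are hypothetical, and the exact option shape is defined by the LlamaDeployConfig type in packages/server/src/types.ts.

import { LlamaIndexServer } from "@llamaindex/server";

new LlamaIndexServer({
  // Hypothetical names: llamaDeploy switches the UI to the workflow-based
  // chat handler and adjusts routing per deployment (see the walkthrough below).
  llamaDeploy: { deployment: "my-deployment", workflow: "my-workflow" },
  uiConfig: { starterQuestions: ["What can you do?"] },
  port: 3000,
}).start();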

@thucpn thucpn requested a review from marcusschiesser June 27, 2025 07:30

changeset-bot bot commented Jun 27, 2025

🦋 Changeset detected

Latest commit: 3665318

The changes in this PR will be included in the next version bump.

This PR includes changesets to release 3 packages:

  • @llamaindex/server: Patch
  • llamaindex-server-examples: Patch
  • @create-llama/llama-index-server: Patch



coderabbitai bot commented Jun 27, 2025

Walkthrough

This update introduces deployment-specific configuration for the server package, enabling a new llamaDeploy option that adjusts routing, constants, and API endpoints based on deployment and workflow names. The client UI now conditionally uses a workflow-based chat handler. Constants for logo and script paths are centralized, and dependencies are updated.
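
Concretely, the UI base path is derived from the deployment name (the pattern is quoted verbatim in a review comment on server.ts below), with the chat endpoint prefixed accordingly, roughly:

// Base-path pattern quoted in the server.ts review comment below;
// the endpoint prefixing is inferred from the sequence diagram.
const deployment = "my-deployment"; // hypothetical deployment name
const basePath = `/deployments/${deployment}/ui`;
const chatApi = `${basePath}/api/chat`;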

Changes

  • .changeset/brown-readers-whisper.md: Added patch note for the new useChatWorkflow feature flag in @llamaindex/server.
  • packages/server/next/app/constants.ts: Introduced constants BASE_PATH, LLAMA_LOGO_URL, and SCRIPT_PATH for centralized configuration.
  • packages/server/next/app/components/ui/chat/chat-avatar.tsx, .../header.tsx: Replaced hardcoded image paths with LLAMA_LOGO_URL from the constants module.
  • packages/server/next/app/components/ui/chat/chat-section.tsx: Added conditional logic to use useChatWorkflow or the standard chat handler based on deployment/workflow config.
  • packages/server/next/app/layout.tsx: Script tag now uses SCRIPT_PATH from constants instead of a hardcoded path.
  • packages/server/package.json, packages/server/project-config/package.json: Updated the @llamaindex/chat-ui dependency from 0.5.6 to 0.5.12.
  • packages/server/src/types.ts: Added the LlamaDeployConfig type; updated UIConfig and LlamaIndexServerOptions to support deployment config.
  • packages/server/src/server.ts: Added the llamaDeploy option, validation, base path logic, and conditional routing/configuration for deployments.
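
Putting the constants row together with the code-graph analysis further down (LLAMA_LOGO_URL at lines 4-6, SCRIPT_PATH at lines 9-11), constants.ts plausibly looks like the sketch below; the concrete asset file names are assumptions.

// packages/server/next/app/constants.ts (sketch, not the verbatim file)
// The server rewrites BASE_PATH for llama-deploy deployments (see the
// nitpick below about runtime modification); it is empty otherwise.
export const BASE_PATH = "";
export const LLAMA_LOGO_URL = `${BASE_PATH}/llama.png`; // logo file name assumed
export const SCRIPT_PATH = `${BASE_PATH}/config.js`; // name assumed, per the public/config.js convention noted in the review context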

Sequence Diagram(s)

sequenceDiagram
    participant User
    participant ChatSection
    participant ChatUI
    participant Server

    User->>ChatSection: Initiates chat
    ChatSection->>ChatSection: Check DEPLOYMENT and WORKFLOW config
    alt Deployment & Workflow set
        ChatSection->>ChatUI: Use workflow-based handler (useChatWorkflow)
    else
        ChatSection->>ChatUI: Use standard chat handler (useChat)
    end
    ChatUI->>Server: Send chat request (API endpoint may be prefixed)
    Server-->>ChatUI: Respond with chat result
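
On the client, the branch in the diagram could be wired roughly as follows. This is a sketch only: the hook option names and the getConfig call shape are assumptions, and a review comment below explicitly asks to verify useChatWorkflow's real signature before merging.

// Sketch of the handler selection in chat-section.tsx. getConfig comes from
// components/ui/lib/utils.ts (per the code-graph analysis); the options
// passed to both hooks are assumptions.
import { useChatWorkflow } from "@llamaindex/chat-ui";
import { useChat } from "ai/react";
import { getConfig } from "./lib/utils";

export function useChatSectionHandler() {
  const deployment = getConfig("DEPLOYMENT"); // key names assumed
  const workflow = getConfig("WORKFLOW");
  const onError = (error: unknown) => console.error(error);

  // Both hooks run unconditionally to respect the rules of hooks;
  // the deployment/workflow config decides which result is used.
  const workflowHandler = useChatWorkflow({ deployment, workflow, onError });
  const defaultHandler = useChat({ api: "/api/chat", onError });

  return deployment && workflow ? workflowHandler : defaultHandler;
}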

Suggested reviewers

  • marcusschiesser
  • leehuwuj

Poem

In the meadow of code, a rabbit hops by,
With new deployments set, and workflows to try.
Constants now gathered, no more path fright,
Logos and scripts shining ever so bright.
Chat flows adapt as the server aligns—
Hopping ahead, through deployment-defined pines! 🐇


📜 Recent review details

Configuration used: .coderabbit.yaml
Review profile: CHILL
Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between f31b609 and 3665318.

📒 Files selected for processing (1)
  • packages/server/package.json (1 hunks)
🚧 Files skipped from review as they are similar to previous changes (1)
  • packages/server/package.json
⏰ Context from checks skipped due to timeout of 90000ms (16)
  • GitHub Check: python (20, 3.11, macos-latest, fastapi, llamacloud)
  • GitHub Check: python (20, 3.11, ubuntu-22.04, fastapi, none)
  • GitHub Check: python (20, 3.11, ubuntu-22.04, fastapi, llamacloud)
  • GitHub Check: python (20, 3.11, windows-latest, fastapi, llamacloud)
  • GitHub Check: typescript (22, ubuntu-22.04, nextjs, llamacloud)
  • GitHub Check: python (20, 3.11, windows-latest, fastapi, none)
  • GitHub Check: typescript (22, windows-latest, nextjs, llamacloud)
  • GitHub Check: python (20, 3.11, macos-latest, fastapi, none)
  • GitHub Check: typescript (22, windows-latest, nextjs, none)
  • GitHub Check: typescript (22, ubuntu-22.04, nextjs, none)
  • GitHub Check: typescript (22, macos-latest, nextjs, none)
  • GitHub Check: typescript (22, macos-latest, nextjs, llamacloud)
  • GitHub Check: Unit Tests (ubuntu-latest, 3.9)
  • GitHub Check: Unit Tests (windows-latest, 3.9)
  • GitHub Check: Type Check
  • GitHub Check: lint

Thanks for using CodeRabbit! It's free for OSS, and your support helps us grow. If you like it, consider giving us a shout-out.

🪧 Tips

Chat

There are 3 ways to chat with CodeRabbit:

  • Review comments: Directly reply to a review comment made by CodeRabbit. Example:
    • I pushed a fix in commit <commit_id>, please review it.
    • Explain this complex logic.
    • Open a follow-up GitHub issue for this discussion.
  • Files and specific lines of code (under the "Files changed" tab): Tag @coderabbitai in a new review comment at the desired location with your query. Examples:
    • @coderabbitai explain this code block.
    • @coderabbitai modularize this function.
  • PR comments: Tag @coderabbitai in a new PR comment to ask questions about the PR branch. For the best results, please provide a very specific query, as very limited context is provided in this mode. Examples:
    • @coderabbitai gather interesting stats about this repository and render them as a table. Additionally, render a pie chart showing the language distribution in the codebase.
    • @coderabbitai read src/utils.ts and explain its main purpose.
    • @coderabbitai read the files in the src/scheduler package and generate a class diagram using mermaid and a README in the markdown format.
    • @coderabbitai help me debug CodeRabbit configuration file.

Support

Need help? Create a ticket on our support page for assistance with any issues or questions.

Note: Be mindful of the bot's finite context window. It's strongly recommended to break down tasks such as reading entire modules into smaller chunks. For a focused discussion, use review comments to chat about specific files and their changes, instead of using the PR comments.

CodeRabbit Commands (Invoked using PR comments)

  • @coderabbitai pause to pause the reviews on a PR.
  • @coderabbitai resume to resume the paused reviews.
  • @coderabbitai review to trigger an incremental review. This is useful when automatic reviews are disabled for the repository.
  • @coderabbitai full review to do a full review from scratch and review all the files again.
  • @coderabbitai summary to regenerate the summary of the PR.
  • @coderabbitai generate docstrings to generate docstrings for this PR.
  • @coderabbitai generate sequence diagram to generate a sequence diagram of the changes in this PR.
  • @coderabbitai resolve resolve all the CodeRabbit review comments.
  • @coderabbitai configuration to show the current CodeRabbit configuration for the repository.
  • @coderabbitai help to get help.

Other keywords and placeholders

  • Add @coderabbitai ignore anywhere in the PR description to prevent this PR from being reviewed.
  • Add @coderabbitai summary to generate the high-level summary at a specific location in the PR description.
  • Add @coderabbitai anywhere in the PR title to generate the title automatically.

Documentation and Community

  • Visit our Documentation for detailed information on how to use CodeRabbit.
  • Join our Discord Community to get help, request features, and share feedback.
  • Follow us on X/Twitter for updates and announcements.

@thucpn thucpn marked this pull request as ready for review July 1, 2025 04:41

@coderabbitai coderabbitai bot left a comment


Actionable comments posted: 3

🔭 Outside diff range comments (1)
packages/server/src/server.ts (1)

125-129: Replace deprecated fs.exists with fs.access.

fs.exists is deprecated; fs.access is the recommended replacement. Passing { recursive: true } to mkdir also avoids failures when the parent directory is missing.

-  private async createComponentsDir(componentsDir: string) {
-    const exists = await promisify(fs.exists)(componentsDir);
-    if (!exists) {
-      await promisify(fs.mkdir)(componentsDir);
-    }
-  }
+  private async createComponentsDir(componentsDir: string) {
+    try {
+      await promisify(fs.access)(componentsDir);
+    } catch {
+      await promisify(fs.mkdir)(componentsDir, { recursive: true });
+    }
+  }
♻️ Duplicate comments (1)
packages/server/project-config/package.json (1)

43-45: Same lock-file sync reminder

Project-level pin updated; make sure pnpm install has been run so lock-file and Docker layers stay in sync.

🧹 Nitpick comments (2)
packages/server/next/app/constants.ts (1)

1-11: Consider the implications of runtime BASE_PATH modification.

While the conditional logic for path construction is correct, having BASE_PATH as a mutable export that is rewritten at runtime (as seen in server.ts) can cause problems:

  1. In development with hot module replacement, the value might not update correctly
  2. Multiple server instances could have race conditions
  3. Testing could be affected by state persistence

Consider using environment variables or a configuration function instead of runtime file modification.
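
For example, an environment-driven variant could look like this (sketch only; NEXT_PUBLIC_BASE_PATH is an assumed variable name, not something this PR introduces):

// Next.js inlines NEXT_PUBLIC_* variables at build time, so the value is
// fixed per build instead of being patched into the file at runtime.
export const BASE_PATH = process.env.NEXT_PUBLIC_BASE_PATH ?? "";
export const LLAMA_LOGO_URL = `${BASE_PATH}/llama.png`; // asset name assumed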

packages/server/src/server.ts (1)

66-66: Consider making the deployment path pattern configurable.

The hardcoded path pattern /deployments/${deployment}/ui might not suit all deployment scenarios. Consider making this configurable.

-    const basePath = `/deployments/${deployment}/ui`;
+    const basePath = process.env.LLAMA_DEPLOY_BASE_PATH_PATTERN
+      ? process.env.LLAMA_DEPLOY_BASE_PATH_PATTERN.replace('{deployment}', deployment)
+      : `/deployments/${deployment}/ui`;
📜 Review details

Configuration used: .coderabbit.yaml
Review profile: CHILL
Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between 952b5b4 and b314ee7.

⛔ Files ignored due to path filters (1)
  • pnpm-lock.yaml is excluded by !**/pnpm-lock.yaml
📒 Files selected for processing (10)
  • .changeset/brown-readers-whisper.md (1 hunks)
  • packages/server/next/app/components/ui/chat/chat-avatar.tsx (1 hunks)
  • packages/server/next/app/components/ui/chat/chat-section.tsx (2 hunks)
  • packages/server/next/app/components/ui/chat/layout/header.tsx (2 hunks)
  • packages/server/next/app/constants.ts (1 hunks)
  • packages/server/next/app/layout.tsx (2 hunks)
  • packages/server/package.json (1 hunks)
  • packages/server/project-config/package.json (1 hunks)
  • packages/server/src/server.ts (4 hunks)
  • packages/server/src/types.ts (1 hunks)
🧰 Additional context used
📓 Path-based instructions (6)
`**/*.{ts,tsx}`: TypeScript code should be linted using ESLint and formatted with Prettier, as enforced by 'pnpm lint' and 'pnpm format' at the root level.

📄 Source: CodeRabbit Inference Engine (CLAUDE.md)

List of files the instruction was applied to:

  • packages/server/next/app/components/ui/chat/layout/header.tsx
  • packages/server/next/app/components/ui/chat/chat-avatar.tsx
  • packages/server/next/app/layout.tsx
  • packages/server/src/types.ts
  • packages/server/next/app/constants.ts
  • packages/server/next/app/components/ui/chat/chat-section.tsx
  • packages/server/src/server.ts
`packages/server/**/*.{ts,tsx}`: TypeScript server code should be located in 'packages/server/' and use Next.js as the framework, with the core server logic implemented in a 'LlamaIndexServer' class.

📄 Source: CodeRabbit Inference Engine (CLAUDE.md)

List of files the instruction was applied to:

  • packages/server/next/app/components/ui/chat/layout/header.tsx
  • packages/server/next/app/components/ui/chat/chat-avatar.tsx
  • packages/server/next/app/layout.tsx
  • packages/server/src/types.ts
  • packages/server/next/app/constants.ts
  • packages/server/next/app/components/ui/chat/chat-section.tsx
  • packages/server/src/server.ts
`packages/server/next/**/*.{js,jsx,ts,tsx}`: UI components for the chat interface, including message history, streaming responses, canvas panel, and custom layouts, should be implemented in the next/ directory using shadcn/ui components and Tailwind CSS. Tailwind CSS should be used for styling UI components.

📄 Source: CodeRabbit Inference Engine (packages/server/CLAUDE.md)

List of files the instruction was applied to:

  • packages/server/next/app/components/ui/chat/layout/header.tsx
  • packages/server/next/app/components/ui/chat/chat-avatar.tsx
  • packages/server/next/app/layout.tsx
  • packages/server/next/app/constants.ts
  • packages/server/next/app/components/ui/chat/chat-section.tsx
`packages/server/**/*.{ts,tsx}`: TypeScript should be used throughout the codebase for type safety.

📄 Source: CodeRabbit Inference Engine (packages/server/CLAUDE.md)

List of files the instruction was applied to:

  • packages/server/next/app/components/ui/chat/layout/header.tsx
  • packages/server/next/app/components/ui/chat/chat-avatar.tsx
  • packages/server/next/app/layout.tsx
  • packages/server/src/types.ts
  • packages/server/next/app/constants.ts
  • packages/server/next/app/components/ui/chat/chat-section.tsx
  • packages/server/src/server.ts
`packages/server/src/types.ts`: Type definitions for WorkflowFactory, UIConfig, and LlamaIndexServerOptions should be implemented in src/types.ts.

📄 Source: CodeRabbit Inference Engine (packages/server/CLAUDE.md)

List of files the instruction was applied to:

  • packages/server/src/types.ts
`packages/server/src/server.ts`: The LlamaIndexServer class should be implemented in src/server.ts and serve as the main server implementation that wraps Next.js.

📄 Source: CodeRabbit Inference Engine (packages/server/CLAUDE.md)

List of files the instruction was applied to:

  • packages/server/src/server.ts
🧠 Learnings (11)
📓 Common learnings
Learnt from: CR
PR: run-llama/create-llama#0
File: python/llama-index-server/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:20:25.864Z
Learning: Workflow factory functions should accept a ChatRequest and return a Workflow instance, following the documented contract.
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/server/examples/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:19:29.871Z
Learning: Applies to packages/server/examples/{simple-workflow/calculator.ts,agentic-rag/index.ts,custom-layout/index.ts,devmode/index.ts,src/app/workflow.ts} : Use the standard server setup pattern: instantiate `LlamaIndexServer` with `workflow`, `uiConfig`, and `port`, then call `.start()`.
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/server/examples/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:19:29.871Z
Learning: Applies to packages/server/examples/{simple-workflow/calculator.ts,agentic-rag/index.ts,custom-layout/index.ts,devmode/index.ts,src/app/workflow.ts} : Configure UI with options such as `starterQuestions`, `layoutDir`, `devMode`, and `suggestNextQuestions` in the server setup.
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/server/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:57.715Z
Learning: Applies to packages/server/src/types.ts : Type definitions for WorkflowFactory, UIConfig, and LlamaIndexServerOptions should be implemented in src/types.ts.
Learnt from: CR
PR: run-llama/create-llama#0
File: python/llama-index-server/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:20:25.864Z
Learning: Applies to python/llama-index-server/llama_index/server/resources/**/* : Bundled UI assets should be included in llama_index/server/resources for package distribution.
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/server/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:57.715Z
Learning: Applies to packages/server/src/server.ts : The LlamaIndexServer class should be implemented in src/server.ts and serve as the main server implementation that wraps Next.js.
Learnt from: CR
PR: run-llama/create-llama#0
File: python/llama-index-server/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:20:25.864Z
Learning: Applies to python/llama-index-server/api/routers/chat.py : The /api/chat endpoint must support streaming responses compatible with Vercel, background tasks for file downloads, and LlamaCloud integration if enabled.
Learnt from: CR
PR: run-llama/create-llama#0
File: python/llama-index-server/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:20:25.864Z
Learning: The LlamaIndexServer should be configured using the workflow_factory parameter, with environment and UI configuration options as shown in the provided example.
Learnt from: CR
PR: run-llama/create-llama#0
File: CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:26.702Z
Learning: Applies to packages/server/**/*.{ts,tsx} : TypeScript server code should be located in 'packages/server/' and use Next.js as the framework, with the core server logic implemented in a 'LlamaIndexServer' class.
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/server/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:57.715Z
Learning: Applies to packages/server/next/**/*.{js,jsx,ts,tsx} : UI components for the chat interface, including message history, streaming responses, canvas panel, and custom layouts, should be implemented in the next/ directory using shadcn/ui components and Tailwind CSS.
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/server/examples/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:19:29.871Z
Learning: Applies to packages/server/examples/src/app/workflow*.ts : Organize workflow files separately in development mode, e.g., `src/app/workflow.ts`.
.changeset/brown-readers-whisper.md (5)
Learnt from: CR
PR: run-llama/create-llama#0
File: python/llama-index-server/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:20:25.864Z
Learning: Applies to python/llama-index-server/api/routers/chat.py : The /api/chat endpoint must support streaming responses compatible with Vercel, background tasks for file downloads, and LlamaCloud integration if enabled.
Learnt from: CR
PR: run-llama/create-llama#0
File: python/llama-index-server/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:20:25.864Z
Learning: Applies to python/llama-index-server/api/routers/chat.py : The /api/chat endpoint should be implemented in api/routers/chat.py and support streaming responses, message format conversion, background tasks, and optional LlamaCloud integration.
Learnt from: CR
PR: run-llama/create-llama#0
File: python/llama-index-server/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:20:25.864Z
Learning: Workflow factory functions should accept a ChatRequest and return a Workflow instance, following the documented contract.
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/server/examples/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:19:29.871Z
Learning: Applies to packages/server/examples/{simple-workflow/calculator.ts,agentic-rag/index.ts,custom-layout/index.ts,devmode/index.ts,src/app/workflow.ts} : Configure UI with options such as `starterQuestions`, `layoutDir`, `devMode`, and `suggestNextQuestions` in the server setup.
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/server/examples/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:19:29.871Z
Learning: Applies to packages/server/examples/src/app/workflow*.ts : Organize workflow files separately in development mode, e.g., `src/app/workflow.ts`.
packages/server/next/app/components/ui/chat/layout/header.tsx (10)
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/server/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:57.715Z
Learning: Applies to packages/server/next/**/*.{js,jsx,ts,tsx} : UI components for the chat interface, including message history, streaming responses, canvas panel, and custom layouts, should be implemented in the next/ directory using shadcn/ui components and Tailwind CSS.
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/server/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:57.715Z
Learning: Applies to packages/server/src/server.ts : The LlamaIndexServer class should be implemented in src/server.ts and serve as the main server implementation that wraps Next.js.
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/server/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:57.715Z
Learning: Applies to packages/server/next/api/{chat,components,layout,files/**}/route.ts : API route handlers for chat, components, layout, and file serving should be implemented in next/api/{chat,components,layout,files/**}/route.ts.
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/server/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:57.715Z
Learning: Applies to packages/server/src/utils/gen-ui.ts : The generateEventComponent function, responsible for using LLMs to auto-generate React components, should be implemented in src/utils/gen-ui.ts.
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/server/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:57.715Z
Learning: Applies to packages/server/src/types.ts : Type definitions for WorkflowFactory, UIConfig, and LlamaIndexServerOptions should be implemented in src/types.ts.
Learnt from: CR
PR: run-llama/create-llama#0
File: CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:26.702Z
Learning: Applies to packages/server/**/*.{ts,tsx} : TypeScript server code should be located in 'packages/server/' and use Next.js as the framework, with the core server logic implemented in a 'LlamaIndexServer' class.
Learnt from: CR
PR: run-llama/create-llama#0
File: python/llama-index-server/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:20:25.864Z
Learning: Applies to python/llama-index-server/llama_index/server/resources/**/* : Bundled UI assets should be included in llama_index/server/resources for package distribution.
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/server/examples/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:19:29.871Z
Learning: Applies to packages/server/examples/layout/**/*.tsx : Place custom React layout components in the `layout/` directory, e.g., `layout/header.tsx`.
Learnt from: CR
PR: run-llama/create-llama#0
File: python/llama-index-server/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:20:25.864Z
Learning: Applies to python/llama-index-server/layout/**/* : Use shared layout components across workflows for layout consistency.
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/create-llama/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:39.539Z
Learning: Applies to packages/create-llama/**/index.ts : The main CLI entry point should be implemented in `index.ts` using Commander.js for argument parsing.
packages/server/project-config/package.json (5)
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/server/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:57.715Z
Learning: Applies to packages/server/next/**/*.{js,jsx,ts,tsx} : UI components for the chat interface, including message history, streaming responses, canvas panel, and custom layouts, should be implemented in the next/ directory using shadcn/ui components and Tailwind CSS.
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/create-llama/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:39.539Z
Learning: Applies to packages/create-llama/**/package.json : The package configuration and binary entry point should be defined in `package.json`, with the binary pointing to `./dist/index.js`.
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/server/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:57.715Z
Learning: Applies to packages/server/src/types.ts : Type definitions for WorkflowFactory, UIConfig, and LlamaIndexServerOptions should be implemented in src/types.ts.
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/server/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:57.715Z
Learning: Applies to packages/server/src/utils/gen-ui.ts : The generateEventComponent function, responsible for using LLMs to auto-generate React components, should be implemented in src/utils/gen-ui.ts.
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/server/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:57.715Z
Learning: Applies to packages/server/src/server.ts : The LlamaIndexServer class should be implemented in src/server.ts and serve as the main server implementation that wraps Next.js.
packages/server/next/app/components/ui/chat/chat-avatar.tsx (10)
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/server/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:57.715Z
Learning: Applies to packages/server/next/**/*.{js,jsx,ts,tsx} : UI components for the chat interface, including message history, streaming responses, canvas panel, and custom layouts, should be implemented in the next/ directory using shadcn/ui components and Tailwind CSS.
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/server/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:57.715Z
Learning: Applies to packages/server/src/utils/gen-ui.ts : The generateEventComponent function, responsible for using LLMs to auto-generate React components, should be implemented in src/utils/gen-ui.ts.
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/server/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:57.715Z
Learning: Applies to packages/server/src/server.ts : The LlamaIndexServer class should be implemented in src/server.ts and serve as the main server implementation that wraps Next.js.
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/server/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:57.715Z
Learning: Applies to packages/server/next/api/{chat,components,layout,files/**}/route.ts : API route handlers for chat, components, layout, and file serving should be implemented in next/api/{chat,components,layout,files/**}/route.ts.
Learnt from: CR
PR: run-llama/create-llama#0
File: CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:26.702Z
Learning: Applies to packages/server/**/*.{ts,tsx} : TypeScript server code should be located in 'packages/server/' and use Next.js as the framework, with the core server logic implemented in a 'LlamaIndexServer' class.
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/server/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:57.715Z
Learning: Applies to packages/server/src/types.ts : Type definitions for WorkflowFactory, UIConfig, and LlamaIndexServerOptions should be implemented in src/types.ts.
Learnt from: CR
PR: run-llama/create-llama#0
File: python/llama-index-server/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:20:25.864Z
Learning: Applies to python/llama-index-server/llama_index/server/resources/**/* : Bundled UI assets should be included in llama_index/server/resources for package distribution.
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/create-llama/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:39.539Z
Learning: Applies to packages/create-llama/**/create-app.ts : Core application creation logic and orchestration should be implemented in `create-app.ts`.
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/create-llama/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:39.539Z
Learning: Applies to packages/create-llama/**/index.ts : The main CLI entry point should be implemented in `index.ts` using Commander.js for argument parsing.
Learnt from: CR
PR: run-llama/create-llama#0
File: python/llama-index-server/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:20:25.864Z
Learning: Applies to python/llama-index-server/api/routers/chat.py : The /api/chat endpoint must support streaming responses compatible with Vercel, background tasks for file downloads, and LlamaCloud integration if enabled.
packages/server/package.json (11)
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/server/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:57.715Z
Learning: Applies to packages/server/next/**/*.{js,jsx,ts,tsx} : UI components for the chat interface, including message history, streaming responses, canvas panel, and custom layouts, should be implemented in the next/ directory using shadcn/ui components and Tailwind CSS.
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/create-llama/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:39.539Z
Learning: Applies to packages/create-llama/**/package.json : The package configuration and binary entry point should be defined in `package.json`, with the binary pointing to `./dist/index.js`.
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/server/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:57.715Z
Learning: Applies to packages/server/src/server.ts : The LlamaIndexServer class should be implemented in src/server.ts and serve as the main server implementation that wraps Next.js.
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/server/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:57.715Z
Learning: Applies to packages/server/src/types.ts : Type definitions for WorkflowFactory, UIConfig, and LlamaIndexServerOptions should be implemented in src/types.ts.
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/server/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:57.715Z
Learning: Applies to packages/server/src/utils/gen-ui.ts : The generateEventComponent function, responsible for using LLMs to auto-generate React components, should be implemented in src/utils/gen-ui.ts.
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/server/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:57.715Z
Learning: Applies to packages/server/next/api/{chat,components,layout,files/**}/route.ts : API route handlers for chat, components, layout, and file serving should be implemented in next/api/{chat,components,layout,files/**}/route.ts.
Learnt from: CR
PR: run-llama/create-llama#0
File: python/llama-index-server/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:20:25.864Z
Learning: Applies to python/llama-index-server/llama_index/server/resources/**/* : Bundled UI assets should be included in llama_index/server/resources for package distribution.
Learnt from: CR
PR: run-llama/create-llama#0
File: CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:26.702Z
Learning: Applies to packages/server/**/*.{ts,tsx} : TypeScript server code should be located in 'packages/server/' and use Next.js as the framework, with the core server logic implemented in a 'LlamaIndexServer' class.
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/server/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:57.715Z
Learning: Applies to packages/server/next/**/*.{js,jsx,ts,tsx} : Tailwind CSS should be used for styling UI components.
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/server/examples/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:19:29.871Z
Learning: Applies to packages/server/examples/{simple-workflow/calculator.ts,agentic-rag/index.ts,custom-layout/index.ts,devmode/index.ts,src/app/workflow.ts} : Use the standard server setup pattern: instantiate `LlamaIndexServer` with `workflow`, `uiConfig`, and `port`, then call `.start()`.
Learnt from: CR
PR: run-llama/create-llama#0
File: python/llama-index-server/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:20:25.864Z
Learning: Applies to python/llama-index-server/api/routers/chat.py : The /api/chat endpoint must support streaming responses compatible with Vercel, background tasks for file downloads, and LlamaCloud integration if enabled.
packages/server/next/app/layout.tsx (10)
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/server/examples/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:19:29.871Z
Learning: Applies to packages/server/examples/layout/**/*.tsx : Place custom React layout components in the `layout/` directory, e.g., `layout/header.tsx`.
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/server/examples/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:19:29.871Z
Learning: Applies to packages/server/examples/custom-layout/layout/** : Exclude runtime React components in `custom-layout/layout` from TypeScript compilation output.
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/server/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:57.715Z
Learning: Applies to packages/server/next/**/*.{js,jsx,ts,tsx} : UI components for the chat interface, including message history, streaming responses, canvas panel, and custom layouts, should be implemented in the next/ directory using shadcn/ui components and Tailwind CSS.
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/server/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:57.715Z
Learning: Applies to packages/server/next/api/{chat,components,layout,files/**}/route.ts : API route handlers for chat, components, layout, and file serving should be implemented in next/api/{chat,components,layout,files/**}/route.ts.
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/server/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:57.715Z
Learning: Applies to packages/server/next/**/*.{js,jsx,ts,tsx} : Tailwind CSS should be used for styling UI components.
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/server/examples/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:19:29.871Z
Learning: Applies to packages/server/examples/src/app/workflow*.ts : Organize workflow files separately in development mode, e.g., `src/app/workflow.ts`.
Learnt from: CR
PR: run-llama/create-llama#0
File: CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:26.702Z
Learning: Applies to packages/server/**/*.{ts,tsx} : TypeScript server code should be located in 'packages/server/' and use Next.js as the framework, with the core server logic implemented in a 'LlamaIndexServer' class.
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/server/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:57.715Z
Learning: Applies to packages/server/public/config.js : Static assets, including client-side config, should be placed in the public/ directory (e.g., public/config.js).
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/server/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:57.715Z
Learning: Applies to packages/server/{data,output}/** : Data and output folders for file integration should be mounted and served as static assets via Next.js.
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/server/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:57.715Z
Learning: Applies to packages/server/src/types.ts : Type definitions for WorkflowFactory, UIConfig, and LlamaIndexServerOptions should be implemented in src/types.ts.
packages/server/src/types.ts (13)
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/server/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:57.715Z
Learning: Applies to packages/server/src/types.ts : Type definitions for WorkflowFactory, UIConfig, and LlamaIndexServerOptions should be implemented in src/types.ts.
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/server/examples/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:19:29.871Z
Learning: Applies to packages/server/examples/{simple-workflow/calculator.ts,agentic-rag/index.ts,custom-layout/index.ts,devmode/index.ts,src/app/workflow.ts} : Use the standard server setup pattern: instantiate `LlamaIndexServer` with `workflow`, `uiConfig`, and `port`, then call `.start()`.
Learnt from: CR
PR: run-llama/create-llama#0
File: python/llama-index-server/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:20:25.864Z
Learning: The LlamaIndexServer should be configured using the workflow_factory parameter, with environment and UI configuration options as shown in the provided example.
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/server/examples/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:19:29.871Z
Learning: Applies to packages/server/examples/src/app/workflow*.ts : Organize workflow files separately in development mode, e.g., `src/app/workflow.ts`.
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/server/examples/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:19:29.871Z
Learning: Applies to packages/server/examples/{simple-workflow/calculator.ts,agentic-rag/index.ts,custom-layout/index.ts,devmode/index.ts,src/app/workflow.ts} : Configure UI with options such as `starterQuestions`, `layoutDir`, `devMode`, and `suggestNextQuestions` in the server setup.
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/server/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:57.715Z
Learning: Applies to packages/server/src/server.ts : The LlamaIndexServer class should be implemented in src/server.ts and serve as the main server implementation that wraps Next.js.
Learnt from: CR
PR: run-llama/create-llama#0
File: python/llama-index-server/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:20:25.864Z
Learning: Applies to python/llama-index-server/llama_index/server/__init__.py : Package exports, including LlamaIndexServer, UIConfig, and UIEvent, should be defined in llama_index/server/__init__.py.
Learnt from: CR
PR: run-llama/create-llama#0
File: CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:26.702Z
Learning: Applies to packages/server/**/*.{ts,tsx} : TypeScript server code should be located in 'packages/server/' and use Next.js as the framework, with the core server logic implemented in a 'LlamaIndexServer' class.
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/create-llama/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:39.539Z
Learning: Applies to packages/create-llama/**/index.ts : The CLI should accept command-line options for framework selection, template type, model providers, vector databases, data sources, tools, and observability options.
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/server/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:57.715Z
Learning: Applies to packages/server/src/utils/gen-ui.ts : The generateEventComponent function, responsible for using LLMs to auto-generate React components, should be implemented in src/utils/gen-ui.ts.
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/create-llama/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:39.539Z
Learning: Applies to packages/create-llama/**/package.json : The package configuration and binary entry point should be defined in `package.json`, with the binary pointing to `./dist/index.js`.
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/create-llama/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:39.539Z
Learning: Applies to packages/create-llama/**/create-app.ts : Core application creation logic and orchestration should be implemented in `create-app.ts`.
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/create-llama/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:39.539Z
Learning: Applies to packages/create-llama/**/index.ts : The main CLI entry point should be implemented in `index.ts` using Commander.js for argument parsing.
packages/server/next/app/constants.ts (14)
Learnt from: CR
PR: run-llama/create-llama#0
File: CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:26.702Z
Learning: Applies to packages/server/**/*.{ts,tsx} : TypeScript server code should be located in 'packages/server/' and use Next.js as the framework, with the core server logic implemented in a 'LlamaIndexServer' class.
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/server/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:57.715Z
Learning: Applies to packages/server/src/server.ts : The LlamaIndexServer class should be implemented in src/server.ts and serve as the main server implementation that wraps Next.js.
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/server/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:57.715Z
Learning: Applies to packages/server/src/types.ts : Type definitions for WorkflowFactory, UIConfig, and LlamaIndexServerOptions should be implemented in src/types.ts.
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/create-llama/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:39.539Z
Learning: Applies to packages/create-llama/**/create-app.ts : Core application creation logic and orchestration should be implemented in `create-app.ts`.
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/server/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:57.715Z
Learning: Applies to packages/server/next/api/{chat,components,layout,files/**}/route.ts : API route handlers for chat, components, layout, and file serving should be implemented in next/api/{chat,components,layout,files/**}/route.ts.
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/create-llama/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:39.539Z
Learning: Applies to packages/create-llama/**/package.json : The package configuration and binary entry point should be defined in `package.json`, with the binary pointing to `./dist/index.js`.
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/create-llama/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:39.539Z
Learning: Applies to packages/create-llama/**/index.ts : The main CLI entry point should be implemented in `index.ts` using Commander.js for argument parsing.
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/server/examples/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:19:29.871Z
Learning: Applies to packages/server/examples/{simple-workflow/calculator.ts,agentic-rag/index.ts,custom-layout/index.ts,devmode/index.ts,src/app/workflow.ts} : Use the standard server setup pattern: instantiate `LlamaIndexServer` with `workflow`, `uiConfig`, and `port`, then call `.start()`.
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/server/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:57.715Z
Learning: Applies to packages/server/public/config.js : Static assets, including client-side config, should be placed in the public/ directory (e.g., public/config.js).
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/server/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:57.715Z
Learning: Applies to packages/server/next/**/*.{js,jsx,ts,tsx} : UI components for the chat interface, including message history, streaming responses, canvas panel, and custom layouts, should be implemented in the next/ directory using shadcn/ui components and Tailwind CSS.
Learnt from: CR
PR: run-llama/create-llama#0
File: CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:26.702Z
Learning: CLI build artifacts and template caches should be cleaned using the 'npm run clean' script in 'packages/create-llama/'.
Learnt from: CR
PR: run-llama/create-llama#0
File: CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:26.702Z
Learning: Applies to packages/create-llama/templates/**/* : Templates for the CLI should be organized under 'packages/create-llama/templates/', with 'types/' for base project structures and 'components/' for reusable framework components.
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/create-llama/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:39.539Z
Learning: Applies to packages/create-llama/**/package.json : Build scripts should be defined in `package.json` for building, cleaning, and developing the CLI.
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/create-llama/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:39.539Z
Learning: Applies to packages/create-llama/helpers/**/* : Utility functions for package management, file operations, and configuration should be placed in the `helpers/` directory.
packages/server/next/app/components/ui/chat/chat-section.tsx (13)
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/server/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:57.715Z
Learning: Applies to packages/server/next/**/*.{js,jsx,ts,tsx} : UI components for the chat interface, including message history, streaming responses, canvas panel, and custom layouts, should be implemented in the next/ directory using shadcn/ui components and Tailwind CSS.
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/server/examples/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:19:29.871Z
Learning: Applies to packages/server/examples/src/app/workflow*.ts : Organize workflow files separately in development mode, e.g., `src/app/workflow.ts`.
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/server/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:57.715Z
Learning: Applies to packages/server/next/api/{chat,components,layout,files/**}/route.ts : API route handlers for chat, components, layout, and file serving should be implemented in next/api/{chat,components,layout,files/**}/route.ts.
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/server/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:57.715Z
Learning: Applies to packages/server/src/types.ts : Type definitions for WorkflowFactory, UIConfig, and LlamaIndexServerOptions should be implemented in src/types.ts.
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/server/examples/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:19:29.871Z
Learning: Applies to packages/server/examples/{simple-workflow/calculator.ts,agentic-rag/index.ts,custom-layout/index.ts,devmode/index.ts,src/app/workflow.ts} : Configure UI with options such as `starterQuestions`, `layoutDir`, `devMode`, and `suggestNextQuestions` in the server setup.
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/server/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:57.715Z
Learning: Applies to packages/server/src/handlers/chat.ts : The handleChat function should process POST requests to /api/chat and be implemented in src/handlers/chat.ts.
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/server/examples/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:19:29.871Z
Learning: Applies to packages/server/examples/{simple-workflow/calculator.ts,agentic-rag/index.ts,custom-layout/index.ts,devmode/index.ts,src/app/workflow.ts} : Use the workflow factory pattern for workflow creation, i.e., define `workflowFactory` as a function returning an agent instance, optionally async.
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/server/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:57.715Z
Learning: Applies to packages/server/next/**/*.{js,jsx,ts,tsx} : Tailwind CSS should be used for styling UI components.
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/server/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:57.715Z
Learning: Applies to packages/server/src/utils/workflow.ts : The runWorkflow function should execute workflows with proper event handling and be implemented in src/utils/workflow.ts.
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/server/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:57.715Z
Learning: Applies to packages/server/src/utils/gen-ui.ts : The generateEventComponent function, responsible for using LLMs to auto-generate React components, should be implemented in src/utils/gen-ui.ts.
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/server/examples/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:19:29.871Z
Learning: Applies to packages/server/examples/{simple-workflow/calculator.ts,agentic-rag/index.ts,custom-layout/index.ts,devmode/index.ts,src/app/workflow.ts} : Use the standard server setup pattern: instantiate `LlamaIndexServer` with `workflow`, `uiConfig`, and `port`, then call `.start()`.
Learnt from: CR
PR: run-llama/create-llama#0
File: python/llama-index-server/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:20:25.864Z
Learning: Workflow factory functions should accept a ChatRequest and return a Workflow instance, following the documented contract.
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/server/examples/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:19:29.871Z
Learning: Applies to packages/server/examples/layout/**/*.tsx : Place custom React layout components in the `layout/` directory, e.g., `layout/header.tsx`.
packages/server/src/server.ts (21)
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/server/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:57.715Z
Learning: Applies to packages/server/src/server.ts : The LlamaIndexServer class should be implemented in src/server.ts and serve as the main server implementation that wraps Next.js.
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/server/examples/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:19:29.871Z
Learning: Applies to packages/server/examples/{simple-workflow/calculator.ts,agentic-rag/index.ts,custom-layout/index.ts,devmode/index.ts,src/app/workflow.ts} : Use the standard server setup pattern: instantiate `LlamaIndexServer` with `workflow`, `uiConfig`, and `port`, then call `.start()`.
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/server/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:57.715Z
Learning: Applies to packages/server/src/types.ts : Type definitions for WorkflowFactory, UIConfig, and LlamaIndexServerOptions should be implemented in src/types.ts.
Learnt from: CR
PR: run-llama/create-llama#0
File: CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:26.702Z
Learning: Applies to packages/server/**/*.{ts,tsx} : TypeScript server code should be located in 'packages/server/' and use Next.js as the framework, with the core server logic implemented in a 'LlamaIndexServer' class.
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/server/examples/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:19:29.871Z
Learning: Applies to packages/server/examples/src/app/workflow*.ts : Organize workflow files separately in development mode, e.g., `src/app/workflow.ts`.
Learnt from: CR
PR: run-llama/create-llama#0
File: python/llama-index-server/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:20:25.864Z
Learning: The LlamaIndexServer should be configured using the workflow_factory parameter, with environment and UI configuration options as shown in the provided example.
Learnt from: CR
PR: run-llama/create-llama#0
File: python/llama-index-server/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:20:25.864Z
Learning: Applies to python/llama-index-server/llama_index/server/server.py : The main LlamaIndexServer class should be implemented in llama_index/server/server.py and extend FastAPI.
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/server/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:57.715Z
Learning: Applies to packages/server/src/events.ts : Event system logic, including source, agent, and artifact events, as well as helper functions for converting LlamaIndex data to UI events, should be implemented in src/events.ts.
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/create-llama/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:39.539Z
Learning: Applies to packages/create-llama/**/index.ts : The main CLI entry point should be implemented in `index.ts` using Commander.js for argument parsing.
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/server/examples/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:19:29.871Z
Learning: Applies to packages/server/examples/**/*.{ts,tsx} : Demonstrate proper async/await patterns and error handling for LLM operations.
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/server/examples/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:19:29.871Z
Learning: Applies to packages/server/examples/{simple-workflow/calculator.ts,agentic-rag/index.ts,custom-layout/index.ts,devmode/index.ts,src/app/workflow.ts} : Configure UI with options such as `starterQuestions`, `layoutDir`, `devMode`, and `suggestNextQuestions` in the server setup.
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/server/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:57.715Z
Learning: Applies to packages/server/next/**/*.{js,jsx,ts,tsx} : UI components for the chat interface, including message history, streaming responses, canvas panel, and custom layouts, should be implemented in the next/ directory using shadcn/ui components and Tailwind CSS.
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/server/examples/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:19:29.871Z
Learning: Applies to packages/server/examples/**/*.ts : Use the `agent()` function from `@llamaindex/workflow` with tool arrays for agent creation.
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/create-llama/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:39.539Z
Learning: Applies to packages/create-llama/**/index.ts : The CLI should accept command-line options for framework selection, template type, model providers, vector databases, data sources, tools, and observability options.
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/create-llama/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:39.539Z
Learning: Applies to packages/create-llama/questions/**/* : Interactive prompts for user configuration should be implemented in the `questions/` directory.
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/create-llama/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:39.539Z
Learning: Applies to packages/create-llama/**/package.json : The package configuration and binary entry point should be defined in `package.json`, with the binary pointing to `./dist/index.js`.
Learnt from: CR
PR: run-llama/create-llama#0
File: python/llama-index-server/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:20:25.864Z
Learning: Applies to python/llama-index-server/{components,layout}/**/* : Custom UI components should be placed in the components/ directory, and custom layout sections in the layout/ directory.
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/server/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:57.715Z
Learning: Applies to packages/server/next/api/{chat,components,layout,files/**}/route.ts : API route handlers for chat, components, layout, and file serving should be implemented in next/api/{chat,components,layout,files/**}/route.ts.
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/server/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:57.715Z
Learning: Applies to packages/server/src/handlers/chat.ts : The handleChat function should process POST requests to /api/chat and be implemented in src/handlers/chat.ts.
Learnt from: CR
PR: run-llama/create-llama#0
File: python/llama-index-server/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:20:25.864Z
Learning: Workflow factory functions should accept a ChatRequest and return a Workflow instance, following the documented contract.
Learnt from: CR
PR: run-llama/create-llama#0
File: packages/server/examples/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:19:29.871Z
Learning: Applies to packages/server/examples/{simple-workflow/calculator.ts,agentic-rag/index.ts,custom-layout/index.ts,devmode/index.ts,src/app/workflow.ts} : Use the workflow factory pattern for workflow creation, i.e., define `workflowFactory` as a function returning an agent instance, optionally async.
🧬 Code Graph Analysis (5)
packages/server/next/app/components/ui/chat/layout/header.tsx (1)
packages/server/next/app/constants.ts (1)
  • LLAMA_LOGO_URL (4-6)
packages/server/next/app/components/ui/chat/chat-avatar.tsx (1)
packages/server/next/app/constants.ts (1)
  • LLAMA_LOGO_URL (4-6)
packages/server/next/app/layout.tsx (1)
packages/server/next/app/constants.ts (1)
  • SCRIPT_PATH (9-11)
packages/server/src/types.ts (1)
python/llama-index-server/llama_index/server/server.py (1)
  • UIConfig (22-76)
packages/server/next/app/components/ui/chat/chat-section.tsx (1)
packages/server/next/app/components/ui/lib/utils.ts (1)
  • getConfig (10-13)
⏰ Context from checks skipped due to timeout of 90000ms (14)
  • GitHub Check: typescript (22, windows-latest, nextjs, llamacloud)
  • GitHub Check: lint
  • GitHub Check: typescript (22, macos-latest, nextjs, llamacloud)
  • GitHub Check: typescript (22, macos-latest, nextjs, none)
  • GitHub Check: typescript (22, windows-latest, nextjs, none)
  • GitHub Check: typescript (22, ubuntu-22.04, nextjs, none)
  • GitHub Check: typescript (22, ubuntu-22.04, nextjs, llamacloud)
  • GitHub Check: python (20, 3.11, ubuntu-22.04, fastapi, none)
  • GitHub Check: python (20, 3.11, windows-latest, fastapi, llamacloud)
  • GitHub Check: python (20, 3.11, ubuntu-22.04, fastapi, llamacloud)
  • GitHub Check: python (20, 3.11, macos-latest, fastapi, llamacloud)
  • GitHub Check: python (20, 3.11, windows-latest, fastapi, none)
  • GitHub Check: python (20, 3.11, macos-latest, fastapi, none)
  • GitHub Check: Unit Tests (windows-latest, 3.9)
🔇 Additional comments (7)
packages/server/package.json (1)

70-72: Confirm lockfile regeneration

  • No other package.json in the monorepo pins @llamaindex/chat-ui@0.5.6.
  • Please run pnpm install (or your preferred workflow) to regenerate pnpm-lock.yaml and commit the updated lockfile so CI will pick up version 0.5.11.
.changeset/brown-readers-whisper.md (1)

1-6: Changeset header LGTM

Valid package name, correct semver bump, concise summary.

packages/server/next/app/components/ui/chat/layout/header.tsx (1)

4-5: Centralised asset path – good call

Replacing the hard-coded string with LLAMA_LOGO_URL removes duplication and honours BASE_PATH in multi-tenant deployments.

Also applies to: 25-26

packages/server/next/app/components/ui/chat/chat-avatar.tsx (1)

4-5: Consistent logo constant

Same improvement as in Header; keeps asset resolution DRY.

Also applies to: 11-12

packages/server/next/app/layout.tsx (1)

7-7: Good refactoring to centralize configuration.

Using the imported SCRIPT_PATH constant instead of hardcoding the path improves maintainability and supports deployment-specific configurations.

Also applies to: 25-25

packages/server/src/types.ts (1)

15-28: Well-structured type definitions for the new deployment feature.

The new LlamaDeployConfig type and the modifications to UIConfig properly support the deployment configuration feature. Making workflow optional in LlamaIndexServerOptions aligns with the conditional logic in the server implementation.
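
From this description, the new type plausibly has the following shape (a sketch: only the deployment and workflow fields are implied by the walkthrough; anything beyond that is unknown):

// Sketch of LlamaDeployConfig as implied by the walkthrough.
export type LlamaDeployConfig = {
  deployment: string; // used to derive the /deployments/<name>/ui base path
  workflow: string; // workflow name the chat UI targets via useChatWorkflow
};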

packages/server/next/app/components/ui/chat/chat-section.tsx (1)

40-44: Verify useChatWorkflow API compatibility

It looks like you’re calling useChatWorkflow (from @llamaindex/chat-ui) with a different signature than useChat (from ai/react). Before merging, please confirm that:

  • useChatWorkflow actually accepts the same option names and callback hooks you need (e.g. onError, any streaming callbacks, etc.).
  • Its return values (messages array, loading state, send function) line up with how you’re consuming useChatHandler in the fallback branch.
  • If there are extra required callbacks on useChatWorkflow (such as onStream or onResponse), add them here or adjust your logic accordingly.

File to review:
packages/server/next/app/components/ui/chat/chat-section.tsx (around lines 40–44)

@thucpn thucpn merged commit 52cc37f into main Jul 1, 2025
18 checks passed
