Conversation

EnthusiAD commented Dec 5, 2025

Description

Fixes a crash in the llamaindex-server (Next.js) template where the generated app fails to start with Error: Cannot find module ... snapshot.cjs.

The Problem

  • The template used an older version of @llamaindex/workflow (~1.1.8), which has a known issue with snapshot.cjs resolution in certain environments.
  • Updating workflow to the latest version causes a peer dependency conflict with @llamaindex/server, which still requires the older version.

The Fix

  1. Updated @llamaindex/workflow to ^1.1.24 (resolves the snapshot crash).
  2. Added "overrides" in package.json to force the server package to accept the new workflow version (see the sketch after this list).
  3. Added .npmrc with legacy-peer-deps=true to the template to ensure npm install succeeds for new users without ERESOLVE errors.
  4. Replaced the llama-cloud dependency with llama-cloud-services to remove deprecation warnings.
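
For reference, a minimal sketch of the two template changes that make up the fix, assuming only the package names and versions listed above (all other fields of the template's package.json are omitted):

```json
{
  "dependencies": {
    "@llamaindex/workflow": "^1.1.24"
  },
  "overrides": {
    "@llamaindex/workflow": "^1.1.24"
  }
}
```

The top-level overrides field (supported by recent npm versions) forces every copy of @llamaindex/workflow in the tree, including the one required by @llamaindex/server, to resolve to ^1.1.24. The added .npmrc sits next to it:

```ini
# packages/create-llama/templates/types/llamaindexserver/nextjs/.npmrc
# Let npm install proceed even when peer dependency ranges conflict (ERESOLVE).
legacy-peer-deps=true
```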

Verification

  • Generated a new app using npx . --template llamaindex-server.
  • Verified npm install runs without error.
  • Verified npm run dev starts the server successfully.
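
The same verification flow, collected as shell commands. This is a sketch: the generated project directory name is a placeholder, and npx . assumes the locally built create-llama package is being run from a repo checkout.

```sh
npx . --template llamaindex-server   # generate a new app from the template
cd my-llamaindex-app                 # placeholder: whatever project name you chose
npm install                          # should finish without ERESOLVE errors
npm run dev                          # should start without the snapshot.cjs crash
```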

Summary by CodeRabbit

  • Chores
    • Updated several core library dependencies to newer versions for improved compatibility and features.
    • Added a new cloud services dependency.
    • Enabled legacy peer dependency handling in npm config to smooth installations and dependency resolution.


changeset-bot bot commented Dec 5, 2025

⚠️ No Changeset found

Latest commit: 82077f2

Merging this PR will not cause a version bump for any packages. If these changes should not result in a new version, you're good to go. If these changes should result in a version bump, you need to add a changeset.

This PR includes no changesets

When changesets are added to this PR, you'll see the packages that this PR includes changesets for and the associated semver types


coderabbitai bot commented Dec 5, 2025

Walkthrough

This PR adds an .npmrc enabling legacy peer deps, updates several @llamaindex and llamaindex dependency versions, adds a llama-cloud-services dependency, and introduces an overrides entry pinning @llamaindex/workflow to ^1.1.24 in the Next.js llamaindex-server template.

Changes

npm configuration
  • File: packages/create-llama/templates/types/llamaindexserver/nextjs/.npmrc
  • Summary: Added legacy-peer-deps=true to allow installation without enforcing peer dependency compatibility.

Dependency updates
  • File: packages/create-llama/templates/types/llamaindexserver/nextjs/package.json
  • Summary: Bumped versions: @llamaindex/openai ~0.4.0→^0.5.0, @llamaindex/server ^0.3.0→^0.4.0, @llamaindex/workflow ~1.1.8→^1.1.24, @llamaindex/tools ~0.1.2→^0.1.0, llamaindex ~0.11.0→^0.12.0; added llama-cloud-services ^0.1.0; added an overrides entry pinning @llamaindex/workflow to ^1.1.24.
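
A sketch of the resulting dependency-related fields in the template's package.json, reconstructed only from the versions listed above; the real file contains additional dependencies, scripts, and metadata.

```json
{
  "dependencies": {
    "@llamaindex/openai": "^0.5.0",
    "@llamaindex/server": "^0.4.0",
    "@llamaindex/tools": "^0.1.0",
    "@llamaindex/workflow": "^1.1.24",
    "llama-cloud-services": "^0.1.0",
    "llamaindex": "^0.12.0"
  },
  "overrides": {
    "@llamaindex/workflow": "^1.1.24"
  }
}
```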

Estimated code review effort

🎯 3 (Moderate) | ⏱️ ~20 minutes

  • Check compatibility across the bumped @llamaindex packages and transitive peers.
  • Verify rationale and compatibility for adding llama-cloud-services.
  • Confirm the overrides entry doesn't mask incompatible transitive versions.

Possibly related PRs

Suggested reviewers

  • marcusschiesser
  • leehuwuj

Poem

🐰 Hop-hop, a patch in my paw,
peers relaxed for installer law,
versions raised, new cloud in sight,
templates tuned for morning light,
I nibble bugs and dance in delight. 🎋

Pre-merge checks and finishing touches

✅ Passed checks (3 passed)
  • Description Check: ✅ Passed. Check skipped - CodeRabbit’s high-level summary is enabled.
  • Title check: ✅ Passed. The title accurately describes the main objective: fixing a workflow snapshot crash in the Next.js server template through dependency updates and configuration changes.
  • Docstring Coverage: ✅ Passed. No functions found in the changed files to evaluate docstring coverage. Skipping docstring coverage check.
✨ Finishing touches

📜 Recent review details

Configuration used: Path: .coderabbit.yaml

Review profile: CHILL

Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between 83ece77 and 82077f2.

📒 Files selected for processing (1)
  • packages/create-llama/templates/types/llamaindexserver/nextjs/.npmrc (1 hunks)
🚧 Files skipped from review as they are similar to previous changes (1)
  • packages/create-llama/templates/types/llamaindexserver/nextjs/.npmrc
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (3)
  • GitHub Check: python (20, 3.11, windows-latest, fastapi, llamacloud)
  • GitHub Check: python (20, 3.11, windows-latest, fastapi, none)
  • GitHub Check: Unit Tests (windows-latest, 3.9)


coderabbitai bot left a comment


Actionable comments posted: 1

📜 Review details

Configuration used: Path: .coderabbit.yaml

Review profile: CHILL

Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between 97a7d9b and 83ece77.

📒 Files selected for processing (2)
  • packages/create-llama/templates/types/llamaindexserver/nextjs/.npmrc (1 hunks)
  • packages/create-llama/templates/types/llamaindexserver/nextjs/package.json (1 hunks)
🧰 Additional context used
🧠 Learnings (14)
📓 Common learnings
Learnt from: CR
Repo: run-llama/create-llama PR: 0
File: packages/server/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:57.724Z
Learning: Applies to packages/server/src/server.ts : The LlamaIndexServer class should be implemented in src/server.ts and serve as the main server implementation that wraps Next.js.
Learnt from: CR
Repo: run-llama/create-llama PR: 0
File: CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:26.711Z
Learning: Applies to packages/server/**/*.{ts,tsx} : TypeScript server code should be located in 'packages/server/' and use Next.js as the framework, with the core server logic implemented in a 'LlamaIndexServer' class.
Learnt from: CR
Repo: run-llama/create-llama PR: 0
File: packages/create-llama/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:39.549Z
Learning: Applies to packages/create-llama/**/package.json : The package configuration and binary entry point should be defined in `package.json`, with the binary pointing to `./dist/index.js`.
Learnt from: CR
Repo: run-llama/create-llama PR: 0
File: packages/server/examples/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:19:29.893Z
Learning: Applies to packages/server/examples/{simple-workflow/calculator.ts,agentic-rag/index.ts,custom-layout/index.ts,devmode/index.ts,src/app/workflow.ts} : Use the standard server setup pattern: instantiate `LlamaIndexServer` with `workflow`, `uiConfig`, and `port`, then call `.start()`.
📚 Learning: 2025-06-30T10:18:57.724Z
Learnt from: CR
Repo: run-llama/create-llama PR: 0
File: packages/server/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:57.724Z
Learning: Applies to packages/server/src/server.ts : The LlamaIndexServer class should be implemented in src/server.ts and serve as the main server implementation that wraps Next.js.

Applied to files:

  • packages/create-llama/templates/types/llamaindexserver/nextjs/.npmrc
  • packages/create-llama/templates/types/llamaindexserver/nextjs/package.json
📚 Learning: 2025-06-30T10:18:39.549Z
Learnt from: CR
Repo: run-llama/create-llama PR: 0
File: packages/create-llama/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:39.549Z
Learning: Applies to packages/create-llama/**/package.json : The package configuration and binary entry point should be defined in `package.json`, with the binary pointing to `./dist/index.js`.

Applied to files:

  • packages/create-llama/templates/types/llamaindexserver/nextjs/.npmrc
  • packages/create-llama/templates/types/llamaindexserver/nextjs/package.json
📚 Learning: 2025-06-30T10:18:26.711Z
Learnt from: CR
Repo: run-llama/create-llama PR: 0
File: CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:26.711Z
Learning: Applies to packages/server/**/*.{ts,tsx} : TypeScript server code should be located in 'packages/server/' and use Next.js as the framework, with the core server logic implemented in a 'LlamaIndexServer' class.

Applied to files:

  • packages/create-llama/templates/types/llamaindexserver/nextjs/.npmrc
  • packages/create-llama/templates/types/llamaindexserver/nextjs/package.json
📚 Learning: 2025-06-30T10:18:39.549Z
Learnt from: CR
Repo: run-llama/create-llama PR: 0
File: packages/create-llama/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:39.549Z
Learning: Applies to packages/create-llama/**/package.json : The package should support a `pack-install` script in `package.json` for creating and installing the local package for testing.

Applied to files:

  • packages/create-llama/templates/types/llamaindexserver/nextjs/.npmrc
  • packages/create-llama/templates/types/llamaindexserver/nextjs/package.json
📚 Learning: 2025-06-30T10:18:26.711Z
Learnt from: CR
Repo: run-llama/create-llama PR: 0
File: CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:26.711Z
Learning: CLI build artifacts and template caches should be cleaned using the 'npm run clean' script in 'packages/create-llama/'.

Applied to files:

  • packages/create-llama/templates/types/llamaindexserver/nextjs/.npmrc
📚 Learning: 2025-06-30T10:18:39.549Z
Learnt from: CR
Repo: run-llama/create-llama PR: 0
File: packages/create-llama/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:39.549Z
Learning: Applies to packages/create-llama/**/index.ts : The main CLI entry point should be implemented in `index.ts` using Commander.js for argument parsing.

Applied to files:

  • packages/create-llama/templates/types/llamaindexserver/nextjs/.npmrc
📚 Learning: 2025-06-30T10:18:39.549Z
Learnt from: CR
Repo: run-llama/create-llama PR: 0
File: packages/create-llama/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:39.549Z
Learning: Applies to packages/create-llama/**/package.json : Testing scripts for end-to-end, Python-specific, and TypeScript-specific templates should be defined in `package.json`.

Applied to files:

  • packages/create-llama/templates/types/llamaindexserver/nextjs/.npmrc
  • packages/create-llama/templates/types/llamaindexserver/nextjs/package.json
📚 Learning: 2025-06-30T10:18:39.549Z
Learnt from: CR
Repo: run-llama/create-llama PR: 0
File: packages/create-llama/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:39.549Z
Learning: Applies to packages/create-llama/**/package.json : Build scripts should be defined in `package.json` for building, cleaning, and developing the CLI.

Applied to files:

  • packages/create-llama/templates/types/llamaindexserver/nextjs/.npmrc
  • packages/create-llama/templates/types/llamaindexserver/nextjs/package.json
📚 Learning: 2025-06-30T10:18:57.724Z
Learnt from: CR
Repo: run-llama/create-llama PR: 0
File: packages/server/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:57.724Z
Learning: Applies to packages/server/next/**/*.{js,jsx,ts,tsx} : Tailwind CSS should be used for styling UI components.

Applied to files:

  • packages/create-llama/templates/types/llamaindexserver/nextjs/.npmrc
📚 Learning: 2025-06-30T10:20:25.875Z
Learnt from: CR
Repo: run-llama/create-llama PR: 0
File: python/llama-index-server/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:20:25.875Z
Learning: Applies to python/llama-index-server/**/pyproject.toml : Package configuration, dependencies, and build settings must be specified in pyproject.toml.

Applied to files:

  • packages/create-llama/templates/types/llamaindexserver/nextjs/.npmrc
  • packages/create-llama/templates/types/llamaindexserver/nextjs/package.json
📚 Learning: 2025-06-30T10:18:57.724Z
Learnt from: CR
Repo: run-llama/create-llama PR: 0
File: packages/server/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:57.724Z
Learning: Applies to packages/server/src/types.ts : Type definitions for WorkflowFactory, UIConfig, and LlamaIndexServerOptions should be implemented in src/types.ts.

Applied to files:

  • packages/create-llama/templates/types/llamaindexserver/nextjs/package.json
📚 Learning: 2025-06-30T10:19:29.893Z
Learnt from: CR
Repo: run-llama/create-llama PR: 0
File: packages/server/examples/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:19:29.893Z
Learning: Applies to packages/server/examples/{simple-workflow/calculator.ts,agentic-rag/index.ts,custom-layout/index.ts,devmode/index.ts,src/app/workflow.ts} : Use the standard server setup pattern: instantiate `LlamaIndexServer` with `workflow`, `uiConfig`, and `port`, then call `.start()`.

Applied to files:

  • packages/create-llama/templates/types/llamaindexserver/nextjs/package.json
📚 Learning: 2025-06-30T10:19:29.893Z
Learnt from: CR
Repo: run-llama/create-llama PR: 0
File: packages/server/examples/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:19:29.893Z
Learning: Applies to packages/server/examples/**/*.ts : Use the `agent()` function from `llamaindex/workflow` with tool arrays for agent creation.

Applied to files:

  • packages/create-llama/templates/types/llamaindexserver/nextjs/package.json
🔇 Additional comments (3)
packages/create-llama/templates/types/llamaindexserver/nextjs/package.json (3)

13-18: Verify that loosening @llamaindex/tools constraint doesn't introduce regressions.

The @llamaindex/tools constraint is being broadened from ~0.1.2 to ^0.1.0, which allows npm to resolve versions 0.1.0 and 0.1.1 that were previously excluded. While this likely resolves npm conflicts, it could reintroduce bugs that were fixed between 0.1.0 and 0.1.2.

Confirm that 0.1.0 and 0.1.1 are compatible with your code and do not have known issues.


22-24: The overrides approach correctly enforces workflow version consistency.

Pinning @llamaindex/workflow via overrides ensures that all dependencies (including @llamaindex/server) use the fixed ^1.1.24 version, which resolves the snapshot.cjs crash. This is the correct way to handle the peer dependency conflict.
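
As a quick sanity check (not part of this PR), the effect of the override can be confirmed in a generated app by inspecting the resolved dependency tree:

```sh
# Lists every resolved copy of @llamaindex/workflow, including the one pulled in
# by @llamaindex/server; with the override applied, each should be 1.1.24 or later.
npm ls @llamaindex/workflow
```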


18-18: No action required—this is the official deprecation migration.

The template's replacement of llama-cloud with llama-cloud-services follows the officially documented migration path from the LlamaIndex/LlamaCloud team. llama-cloud-services is the designated successor to the deprecated @llamaindex/cloud package and is designed as a compatible replacement. Updating the dependency in package.json is the correct approach, and no additional code verification is necessary.

