fix(templates): resolve workflow snapshot crash in nextjs server template #717
base: main
Walkthrough
This PR adds an `.npmrc` enabling legacy peer deps and updates several dependencies in the Next.js server template.
Changes
Estimated code review effort: 🎯 3 (Moderate) | ⏱️ ~20 minutes
Actionable comments posted: 1
📜 Review details
Configuration used: Path: .coderabbit.yaml
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (2)
packages/create-llama/templates/types/llamaindexserver/nextjs/.npmrc (1 hunks)
packages/create-llama/templates/types/llamaindexserver/nextjs/package.json (1 hunks)
🧰 Additional context used
🧠 Learnings (14)
📓 Common learnings
Learnt from: CR
Repo: run-llama/create-llama PR: 0
File: packages/server/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:57.724Z
Learning: Applies to packages/server/src/server.ts : The LlamaIndexServer class should be implemented in src/server.ts and serve as the main server implementation that wraps Next.js.
Learnt from: CR
Repo: run-llama/create-llama PR: 0
File: CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:26.711Z
Learning: Applies to packages/server/**/*.{ts,tsx} : TypeScript server code should be located in 'packages/server/' and use Next.js as the framework, with the core server logic implemented in a 'LlamaIndexServer' class.
Learnt from: CR
Repo: run-llama/create-llama PR: 0
File: packages/create-llama/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:39.549Z
Learning: Applies to packages/create-llama/**/package.json : The package configuration and binary entry point should be defined in `package.json`, with the binary pointing to `./dist/index.js`.
Learnt from: CR
Repo: run-llama/create-llama PR: 0
File: packages/server/examples/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:19:29.893Z
Learning: Applies to packages/server/examples/{simple-workflow/calculator.ts,agentic-rag/index.ts,custom-layout/index.ts,devmode/index.ts,src/app/workflow.ts} : Use the standard server setup pattern: instantiate `LlamaIndexServer` with `workflow`, `uiConfig`, and `port`, then call `.start()`.
📚 Learning: 2025-06-30T10:18:57.724Z
Learnt from: CR
Repo: run-llama/create-llama PR: 0
File: packages/server/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:57.724Z
Learning: Applies to packages/server/src/server.ts : The LlamaIndexServer class should be implemented in src/server.ts and serve as the main server implementation that wraps Next.js.
Applied to files:
packages/create-llama/templates/types/llamaindexserver/nextjs/.npmrc
packages/create-llama/templates/types/llamaindexserver/nextjs/package.json
📚 Learning: 2025-06-30T10:18:39.549Z
Learnt from: CR
Repo: run-llama/create-llama PR: 0
File: packages/create-llama/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:39.549Z
Learning: Applies to packages/create-llama/**/package.json : The package configuration and binary entry point should be defined in `package.json`, with the binary pointing to `./dist/index.js`.
Applied to files:
packages/create-llama/templates/types/llamaindexserver/nextjs/.npmrc
packages/create-llama/templates/types/llamaindexserver/nextjs/package.json
📚 Learning: 2025-06-30T10:18:26.711Z
Learnt from: CR
Repo: run-llama/create-llama PR: 0
File: CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:26.711Z
Learning: Applies to packages/server/**/*.{ts,tsx} : TypeScript server code should be located in 'packages/server/' and use Next.js as the framework, with the core server logic implemented in a 'LlamaIndexServer' class.
Applied to files:
packages/create-llama/templates/types/llamaindexserver/nextjs/.npmrc
packages/create-llama/templates/types/llamaindexserver/nextjs/package.json
📚 Learning: 2025-06-30T10:18:39.549Z
Learnt from: CR
Repo: run-llama/create-llama PR: 0
File: packages/create-llama/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:39.549Z
Learning: Applies to packages/create-llama/**/package.json : The package should support a `pack-install` script in `package.json` for creating and installing the local package for testing.
Applied to files:
packages/create-llama/templates/types/llamaindexserver/nextjs/.npmrc
packages/create-llama/templates/types/llamaindexserver/nextjs/package.json
📚 Learning: 2025-06-30T10:18:26.711Z
Learnt from: CR
Repo: run-llama/create-llama PR: 0
File: CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:26.711Z
Learning: CLI build artifacts and template caches should be cleaned using the 'npm run clean' script in 'packages/create-llama/'.
Applied to files:
packages/create-llama/templates/types/llamaindexserver/nextjs/.npmrc
📚 Learning: 2025-06-30T10:18:39.549Z
Learnt from: CR
Repo: run-llama/create-llama PR: 0
File: packages/create-llama/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:39.549Z
Learning: Applies to packages/create-llama/**/index.ts : The main CLI entry point should be implemented in `index.ts` using Commander.js for argument parsing.
Applied to files:
packages/create-llama/templates/types/llamaindexserver/nextjs/.npmrc
📚 Learning: 2025-06-30T10:18:39.549Z
Learnt from: CR
Repo: run-llama/create-llama PR: 0
File: packages/create-llama/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:39.549Z
Learning: Applies to packages/create-llama/**/package.json : Testing scripts for end-to-end, Python-specific, and TypeScript-specific templates should be defined in `package.json`.
Applied to files:
packages/create-llama/templates/types/llamaindexserver/nextjs/.npmrc
packages/create-llama/templates/types/llamaindexserver/nextjs/package.json
📚 Learning: 2025-06-30T10:18:39.549Z
Learnt from: CR
Repo: run-llama/create-llama PR: 0
File: packages/create-llama/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:39.549Z
Learning: Applies to packages/create-llama/**/package.json : Build scripts should be defined in `package.json` for building, cleaning, and developing the CLI.
Applied to files:
packages/create-llama/templates/types/llamaindexserver/nextjs/.npmrc
packages/create-llama/templates/types/llamaindexserver/nextjs/package.json
📚 Learning: 2025-06-30T10:18:57.724Z
Learnt from: CR
Repo: run-llama/create-llama PR: 0
File: packages/server/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:57.724Z
Learning: Applies to packages/server/next/**/*.{js,jsx,ts,tsx} : Tailwind CSS should be used for styling UI components.
Applied to files:
packages/create-llama/templates/types/llamaindexserver/nextjs/.npmrc
📚 Learning: 2025-06-30T10:20:25.875Z
Learnt from: CR
Repo: run-llama/create-llama PR: 0
File: python/llama-index-server/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:20:25.875Z
Learning: Applies to python/llama-index-server/**/pyproject.toml : Package configuration, dependencies, and build settings must be specified in pyproject.toml.
Applied to files:
packages/create-llama/templates/types/llamaindexserver/nextjs/.npmrc
packages/create-llama/templates/types/llamaindexserver/nextjs/package.json
📚 Learning: 2025-06-30T10:18:57.724Z
Learnt from: CR
Repo: run-llama/create-llama PR: 0
File: packages/server/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:18:57.724Z
Learning: Applies to packages/server/src/types.ts : Type definitions for WorkflowFactory, UIConfig, and LlamaIndexServerOptions should be implemented in src/types.ts.
Applied to files:
packages/create-llama/templates/types/llamaindexserver/nextjs/package.json
📚 Learning: 2025-06-30T10:19:29.893Z
Learnt from: CR
Repo: run-llama/create-llama PR: 0
File: packages/server/examples/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:19:29.893Z
Learning: Applies to packages/server/examples/{simple-workflow/calculator.ts,agentic-rag/index.ts,custom-layout/index.ts,devmode/index.ts,src/app/workflow.ts} : Use the standard server setup pattern: instantiate `LlamaIndexServer` with `workflow`, `uiConfig`, and `port`, then call `.start()`.
Applied to files:
packages/create-llama/templates/types/llamaindexserver/nextjs/package.json
📚 Learning: 2025-06-30T10:19:29.893Z
Learnt from: CR
Repo: run-llama/create-llama PR: 0
File: packages/server/examples/CLAUDE.md:0-0
Timestamp: 2025-06-30T10:19:29.893Z
Learning: Applies to packages/server/examples/**/*.ts : Use the `agent()` function from `llamaindex/workflow` with tool arrays for agent creation.
Applied to files:
packages/create-llama/templates/types/llamaindexserver/nextjs/package.json
🔇 Additional comments (3)
packages/create-llama/templates/types/llamaindexserver/nextjs/package.json (3)
13-18: Verify that loosening the `@llamaindex/tools` constraint doesn't introduce regressions.
The `@llamaindex/tools` constraint is being broadened from `~0.1.2` to `^0.1.0`, which allows npm to resolve versions 0.1.0 and 0.1.1 that were previously excluded. While this likely resolves npm conflicts, it could reintroduce bugs that were fixed between 0.1.0 and 0.1.2. Confirm that 0.1.0 and 0.1.1 are compatible with your code and do not have known issues.
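To make the range difference concrete, here is a simplified sketch of tilde vs. caret range matching (this hand-rolled check is only illustrative; it is not npm's actual semver resolver and ignores prerelease tags):

```javascript
// Simplified semver range check for tilde (~) and caret (^) ranges.
// ~0.1.2 means >=0.1.2 <0.2.0; ^0.1.0 on a 0.x version means >=0.1.0 <0.2.0.
function parse(v) {
  return v.split(".").map(Number);
}

function satisfies(version, range) {
  const op = range[0];
  const [bM, bm, bp] = parse(range.slice(1));
  const [M, m, p] = parse(version);
  const atLeastBase =
    M > bM || (M === bM && (m > bm || (m === bm && p >= bp)));
  if (op === "~") return atLeastBase && M === bM && m === bm; // patch updates only
  if (op === "^") {
    if (bM > 0) return atLeastBase && M === bM; // minor+patch updates
    return atLeastBase && M === 0 && m === bm; // 0.x: caret pins the minor
  }
  return version === range; // exact pin
}

console.log(satisfies("0.1.1", "~0.1.2")); // false: excluded by the old constraint
console.log(satisfies("0.1.1", "^0.1.0")); // true: admitted by the new constraint
```

This is why broadening `~0.1.2` to `^0.1.0` re-admits 0.1.0 and 0.1.1, exactly as the review comment warns.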
22-24: The `overrides` approach correctly enforces workflow version consistency.
Pinning `@llamaindex/workflow` via overrides ensures that all dependencies (including `@llamaindex/server`) use the fixed `^1.1.24` version, which resolves the snapshot.cjs crash. This is the correct way to handle the peer dependency conflict.
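A minimal sketch of the relevant `package.json` fields is shown below; the `^1.1.24` value is taken from the review, while the surrounding fields are elided:

```json
{
  "dependencies": {
    "@llamaindex/workflow": "^1.1.24"
  },
  "overrides": {
    "@llamaindex/workflow": "^1.1.24"
  }
}
```

The `overrides` entry tells npm to replace any transitive `@llamaindex/workflow` requirement (such as the one declared by `@llamaindex/server`) with this range, so the whole tree resolves to a single version.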
18-18: No action required: this is the official deprecation migration.
The template's replacement of `llama-cloud` with `llama-cloud-services` follows the officially documented migration path from the LlamaIndex/LlamaCloud team. `llama-cloud-services` is the designated successor to the deprecated `@llamaindex/cloud` package and is designed as a compatible replacement. Updating the dependency in `package.json` is the correct approach, and no additional code verification is necessary.
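As a quick sanity check, a generated project can be scanned for the deprecated package name; the helper below is a hypothetical illustration, not part of the template, and the example version specifiers are invented:

```javascript
// Hypothetical check: flag deprecated dependency names in a package.json string.
const DEPRECATED = { "llama-cloud": "llama-cloud-services" };

function findDeprecated(pkgJson) {
  const pkg = JSON.parse(pkgJson);
  const deps = { ...pkg.dependencies, ...pkg.devDependencies };
  return Object.keys(deps)
    .filter((name) => name in DEPRECATED)
    .map((name) => `${name} -> ${DEPRECATED[name]}`);
}

// Example input with an illustrative (made-up) version specifier:
const example = JSON.stringify({
  dependencies: { "llama-cloud": "~0.2.0", next: "^15.0.0" },
});
console.log(findDeprecated(example)); // [ 'llama-cloud -> llama-cloud-services' ]
```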
Description
Fixes a crash in the `llamaindex-server` (Next.js) template where the generated app fails to start with `Error: Cannot find module ... snapshot.cjs`.

The Problem
- `@llamaindex/workflow` (~1.1.8) has a known issue with `snapshot.cjs` resolution in certain environments.
- Upgrading `workflow` to the latest version causes a peer dependency conflict with `@llamaindex/server`, which still requires the older version.

The Fix
- Upgrade `@llamaindex/workflow` to `^1.1.24` (resolves the snapshot crash).
- Use `"overrides"` in `package.json` to force the server package to accept the new workflow version.
- Add an `.npmrc` with `legacy-peer-deps=true` to the template to ensure `npm install` succeeds for new users without ERESOLVE errors.
- Migrate the deprecated `llama-cloud` dependency to `llama-cloud-services` to remove deprecation warnings.

Verification
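Based on the description, the added `.npmrc` is a one-line config file (the path is the template directory named in the review):

```ini
# packages/create-llama/templates/types/llamaindexserver/nextjs/.npmrc
legacy-peer-deps=true
```

With this flag set, npm 7+ falls back to the npm 6 behavior of not auto-installing or strictly enforcing peer dependencies, so the install no longer fails with ERESOLVE on the conflicting peer ranges.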
- Generated a new app with `npx . --template llamaindex-server`.
- `npm install` runs without error.
- `npm run dev` starts the server successfully.