Conversation

@thucpn thucpn commented Jun 4, 2025

Duplicate Dependency Resolution Issue:
The examples package explicitly listed @llamaindex/workflow@1.1.3 in its dependencies, while @llamaindex/server@workspace:* (linked to packages/server) also depended on @llamaindex/workflow@1.1.3. pnpm resolved these as two separate instances of @llama-flow/core@0.4.1, causing an instanceof mismatch between them.

To detect the issue, run: cd packages/server/examples && pnpm why @llama-flow/core

@llamaindex/server link:..
└─┬ @llamaindex/workflow 1.1.3
  └── @llama-flow/core 0.4.1
  
@llamaindex/workflow 1.1.3
└── @llama-flow/core 0.4.1

llamaindex 0.11.1
└─┬ @llamaindex/cloud 4.0.9
  └── @llama-flow/core 0.4.1 peer
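
For context, here is a minimal, self-contained TypeScript sketch of why two copies of the same package break instanceof checks. The WorkflowEvent class name is made up for illustration and is not @llama-flow/core's actual API:

// duplicate-instance.ts: simulate pnpm resolving two separate copies of a
// package. Each call returns a structurally identical but distinct class,
// just like two node_modules copies of @llama-flow/core would be.
function loadCoreCopy() {
  return class WorkflowEvent {
    constructor(public readonly data: unknown) {}
  };
}

const EventViaServer = loadCoreCopy();   // the copy resolved under @llamaindex/server
const EventViaExamples = loadCoreCopy(); // the copy resolved under the examples package

const ev = new EventViaServer("hello");
console.log(ev instanceof EventViaServer);   // true
console.log(ev instanceof EventViaExamples); // false: same code, different class identity

Once the duplicate is removed, the server and the examples load the same module instance, so such checks agree.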

This PR removes @llamaindex/workflow from the examples package’s dependencies and relies on @llamaindex/server to provide it. This ensures a single instance of @llama-flow/core@0.4.1 is used by both the examples and the server.

Summary by CodeRabbit

  • Chores
    • Updated dependency versions for improved compatibility and stability.
    • Removed an unused workflow dependency.
  • Refactor
    • Adjusted import sources for AI-related classes to enhance clarity and maintainability.
    • Explicitly configured the language model used in workflows and server setups.

@thucpn thucpn requested a review from marcusschiesser June 4, 2025 11:17

changeset-bot bot commented Jun 4, 2025

⚠️ No Changeset found

Latest commit: c6bedf0

Merging this PR will not cause a version bump for any packages. If these changes should not result in a new version, you're good to go. If these changes should result in a version bump, you need to add a changeset.

This PR includes no changesets

When changesets are added to this PR, you'll see the packages that this PR includes changesets for and the associated semver types


coderabbitai bot commented Jun 4, 2025

Walkthrough

The changes update import statements to source OpenAI and OpenAIEmbedding from @llamaindex/openai instead of llamaindex, adjust dependency versions in package.json to newer, more restrictive versions, and remove the @llamaindex/workflow dependency. Additionally, the global LLM setting is explicitly configured to use the "gpt-4o-mini" model in multiple example workflows.
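
In code, the refactor amounts to the following pattern; this is a sketch assembled from the diffs in this PR rather than a verbatim copy of any one example file:

// Before: provider classes were imported from the umbrella package
// import { OpenAI, OpenAIEmbedding, Settings } from "llamaindex";

// After: provider-specific classes come from the dedicated package,
// while core utilities such as Settings stay in llamaindex.
import { OpenAI, OpenAIEmbedding } from "@llamaindex/openai";
import { Settings } from "llamaindex";

// Explicitly configure the global LLM instead of relying on an implicit default.
Settings.llm = new OpenAI({ model: "gpt-4o-mini" });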

Changes

File(s) and change summary:

  • .../examples/agentic-rag/index.ts: Refactored imports to source OpenAI and OpenAIEmbedding from @llamaindex/openai instead of llamaindex.
  • .../examples/package.json: Updated dependency versions to newer, more restrictive ranges; removed the @llamaindex/workflow dependency.
  • .../examples/simple-workflow/calculator.ts: Imported OpenAI from @llamaindex/openai; explicitly set Settings.llm to use the "gpt-4o-mini" model.
  • .../examples/custom-layout/index.ts: Imported OpenAI from @llamaindex/openai; explicitly set Settings.llm to use the "gpt-4o-mini" model.
  • .../examples/devmode/index.ts: Imported OpenAI from @llamaindex/openai; explicitly set Settings.llm to use the "gpt-4o-mini" model.

Sequence Diagram(s)

sequenceDiagram
    participant User
    participant ExampleWorkflow
    participant OpenAI as OpenAI (from @llamaindex/openai)

    User->>ExampleWorkflow: Trigger workflow operation
    ExampleWorkflow->>OpenAI: Send LLM request (using gpt-4o-mini)
    OpenAI-->>ExampleWorkflow: Return LLM response
    ExampleWorkflow-->>User: Return processed result

Poem

A hop and a skip, the imports align,
With OpenAI’s source now clearly defined.
Dependencies pruned, versions anew,
The calculator’s smarter, its logic rings true.
In the warren of code, all neat and precise—
This bunny approves, the changes are nice! 🐇✨


@thucpn thucpn changed the title from "fix: conflict package versions in ts examples" to "[WIP] fix: conflict package versions in ts examples" Jun 4, 2025
@thucpn thucpn marked this pull request as ready for review June 4, 2025 16:48
@thucpn thucpn changed the title from "[WIP] fix: conflict package versions in ts examples" to "fix: conflict package versions in ts examples" Jun 4, 2025

@coderabbitai coderabbitai bot left a comment


Actionable comments posted: 2

📜 Review details

Configuration used: .coderabbit.yaml
Review profile: CHILL
Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between 13a967b and 0aab544.

⛔ Files ignored due to path filters (1)
  • pnpm-lock.yaml is excluded by !**/pnpm-lock.yaml
📒 Files selected for processing (3)
  • packages/server/examples/agentic-rag/index.ts (1 hunks)
  • packages/server/examples/package.json (1 hunks)
  • packages/server/examples/simple-workflow/calculator.ts (1 hunks)
🔇 Additional comments (5)
packages/server/examples/simple-workflow/calculator.ts (3)

1-1: LGTM: Import refactoring aligns with package restructuring.

Moving the OpenAI import to @llamaindex/openai is consistent with the dependency updates and package restructuring.


4-4: LGTM: Clean separation of imports.

Good separation keeping Settings and tool in the main llamaindex package while moving OpenAI-specific imports to the dedicated package.


7-9: LGTM: Explicit LLM configuration improves clarity.

Explicitly setting the LLM model to "gpt-4o-mini" is a good practice that makes the configuration clear and removes ambiguity about which model is being used.

packages/server/examples/agentic-rag/index.ts (2)

1-1: LGTM: Consistent import refactoring for OpenAI components.

Moving OpenAI and OpenAIEmbedding imports to @llamaindex/openai is consistent with the package restructuring seen across the examples.


4-4: LGTM: Proper separation of core LlamaIndex imports.

Keeping core LlamaIndex functionality (Document, Settings, VectorStoreIndex) in the main package while separating OpenAI-specific components is a clean architectural choice.


@coderabbitai coderabbitai bot left a comment


Actionable comments posted: 0

🧹 Nitpick comments (2)
packages/server/examples/custom-layout/index.ts (1)

7-10: Consider making the model configurable.

The explicit LLM configuration is good for ensuring consistent behavior. However, the hardcoded model name could be made more flexible.

Consider using an environment variable for the model:

 Settings.llm = new OpenAI({
-  model: "gpt-4o-mini",
+  model: process.env.OPENAI_MODEL || "gpt-4o-mini",
 });
packages/server/examples/devmode/index.ts (1)

6-9: Explicit LLM configuration improves consistency.

The explicit configuration ensures the example uses the intended model. The pattern is consistent with other example files in this PR.

For better maintainability, consider extracting the model configuration to a shared constant or environment variable, especially since this pattern is repeated across multiple example files.
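
One possible shape for that extraction, sketched under the assumption of a new shared module (the file name and helper are hypothetical, not part of this PR):

// examples/shared/llm.ts (hypothetical file)
import { OpenAI } from "@llamaindex/openai";

// Allow overriding the model per environment, falling back to the
// default used across the examples.
export const EXAMPLE_MODEL = process.env.OPENAI_MODEL ?? "gpt-4o-mini";

export function createExampleLLM(): OpenAI {
  return new OpenAI({ model: EXAMPLE_MODEL });
}

Each example could then set Settings.llm = createExampleLLM() instead of repeating the model name.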

📜 Review details

Configuration used: .coderabbit.yaml
Review profile: CHILL
Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between 0aab544 and c6bedf0.

📒 Files selected for processing (2)
  • packages/server/examples/custom-layout/index.ts (1 hunks)
  • packages/server/examples/devmode/index.ts (1 hunks)
⏰ Context from checks skipped due to timeout of 90000ms (56)
  • GitHub Check: typescript (22, 3.11, ubuntu-22.04, nextjs, --llamacloud, llamaindexserver)
  • GitHub Check: typescript (22, 3.11, ubuntu-22.04, nextjs, --llamacloud, streaming)
  • GitHub Check: typescript (22, 3.11, ubuntu-22.04, nextjs, --example-file, llamaindexserver)
  • GitHub Check: typescript (22, 3.11, ubuntu-22.04, nextjs, --example-file, streaming)
  • GitHub Check: typescript (22, 3.11, ubuntu-22.04, nextjs, --no-files, llamaindexserver)
  • GitHub Check: typescript (22, 3.11, ubuntu-22.04, nextjs, --no-files, streaming)
  • GitHub Check: typescript (22, 3.11, windows-latest, nextjs, --llamacloud, llamaindexserver)
  • GitHub Check: typescript (22, 3.11, windows-latest, nextjs, --llamacloud, streaming)
  • GitHub Check: typescript (22, 3.11, windows-latest, nextjs, --example-file, llamaindexserver)
  • GitHub Check: typescript (22, 3.11, windows-latest, nextjs, --example-file, streaming)
  • GitHub Check: typescript (22, 3.11, windows-latest, nextjs, --no-files, llamaindexserver)
  • GitHub Check: typescript (22, 3.11, windows-latest, nextjs, --no-files, streaming)
  • GitHub Check: typescript (22, 3.11, macos-latest, nextjs, --llamacloud, llamaindexserver)
  • GitHub Check: typescript (22, 3.11, macos-latest, nextjs, --llamacloud, streaming)
  • GitHub Check: typescript (22, 3.11, macos-latest, nextjs, --example-file, llamaindexserver)
  • GitHub Check: typescript (22, 3.11, macos-latest, nextjs, --example-file, streaming)
  • GitHub Check: typescript (22, 3.11, macos-latest, nextjs, --no-files, llamaindexserver)
  • GitHub Check: typescript (22, 3.11, macos-latest, nextjs, --no-files, streaming)
  • GitHub Check: typescript (20, 3.11, ubuntu-22.04, nextjs, --llamacloud, llamaindexserver)
  • GitHub Check: typescript (20, 3.11, ubuntu-22.04, nextjs, --llamacloud, streaming)
  • GitHub Check: typescript (20, 3.11, ubuntu-22.04, nextjs, --example-file, llamaindexserver)
  • GitHub Check: python (20, 3.11, ubuntu-22.04, fastapi, --llamacloud, llamaindexserver)
  • GitHub Check: typescript (20, 3.11, ubuntu-22.04, nextjs, --example-file, streaming)
  • GitHub Check: python (20, 3.11, ubuntu-22.04, fastapi, --llamacloud, streaming)
  • GitHub Check: typescript (20, 3.11, ubuntu-22.04, nextjs, --no-files, llamaindexserver)
  • GitHub Check: python (20, 3.11, ubuntu-22.04, fastapi, --example-file, llamaindexserver)
  • GitHub Check: typescript (20, 3.11, ubuntu-22.04, nextjs, --no-files, streaming)
  • GitHub Check: python (20, 3.11, ubuntu-22.04, fastapi, --example-file, streaming)
  • GitHub Check: typescript (20, 3.11, windows-latest, nextjs, --llamacloud, llamaindexserver)
  • GitHub Check: python (20, 3.11, ubuntu-22.04, fastapi, --no-files, llamaindexserver)
  • GitHub Check: typescript (20, 3.11, windows-latest, nextjs, --llamacloud, streaming)
  • GitHub Check: python (20, 3.11, ubuntu-22.04, fastapi, --no-files, streaming)
  • GitHub Check: typescript (20, 3.11, windows-latest, nextjs, --example-file, llamaindexserver)
  • GitHub Check: python (20, 3.11, windows-latest, fastapi, --llamacloud, llamaindexserver)
  • GitHub Check: typescript (20, 3.11, windows-latest, nextjs, --example-file, streaming)
  • GitHub Check: python (20, 3.11, windows-latest, fastapi, --llamacloud, streaming)
  • GitHub Check: typescript (20, 3.11, windows-latest, nextjs, --no-files, llamaindexserver)
  • GitHub Check: python (20, 3.11, windows-latest, fastapi, --example-file, llamaindexserver)
  • GitHub Check: typescript (20, 3.11, windows-latest, nextjs, --no-files, streaming)
  • GitHub Check: python (20, 3.11, windows-latest, fastapi, --example-file, streaming)
  • GitHub Check: typescript (20, 3.11, macos-latest, nextjs, --llamacloud, llamaindexserver)
  • GitHub Check: python (20, 3.11, windows-latest, fastapi, --no-files, llamaindexserver)
  • GitHub Check: typescript (20, 3.11, macos-latest, nextjs, --llamacloud, streaming)
  • GitHub Check: python (20, 3.11, windows-latest, fastapi, --no-files, streaming)
  • GitHub Check: typescript (20, 3.11, macos-latest, nextjs, --example-file, llamaindexserver)
  • GitHub Check: python (20, 3.11, macos-latest, fastapi, --llamacloud, llamaindexserver)
  • GitHub Check: typescript (20, 3.11, macos-latest, nextjs, --example-file, streaming)
  • GitHub Check: python (20, 3.11, macos-latest, fastapi, --llamacloud, streaming)
  • GitHub Check: typescript (20, 3.11, macos-latest, nextjs, --no-files, llamaindexserver)
  • GitHub Check: python (20, 3.11, macos-latest, fastapi, --example-file, llamaindexserver)
  • GitHub Check: typescript (20, 3.11, macos-latest, nextjs, --no-files, streaming)
  • GitHub Check: python (20, 3.11, macos-latest, fastapi, --example-file, streaming)
  • GitHub Check: Unit Tests (windows-latest, 3.9)
  • GitHub Check: python (20, 3.11, macos-latest, fastapi, --no-files, llamaindexserver)
  • GitHub Check: python (20, 3.11, macos-latest, fastapi, --no-files, streaming)
  • GitHub Check: lint
🔇 Additional comments (4)
packages/server/examples/custom-layout/index.ts (2)

1-1: LGTM! Import restructuring aligns with dependency conflict resolution.

Moving the OpenAI import from llamaindex to @llamaindex/openai helps resolve the dependency conflicts mentioned in the PR objectives.


4-4: Good addition of Settings import.

Adding Settings to the imports enables explicit LLM configuration, which improves control over the model behavior.

packages/server/examples/devmode/index.ts (2)

1-1: LGTM! Consistent dependency restructuring.

The import change aligns with the dependency conflict resolution strategy and maintains consistency across example files.


3-3: Good addition for explicit configuration.

Adding the Settings import enables explicit LLM configuration, improving predictability of the example behavior.

@marcusschiesser marcusschiesser merged commit 63edd74 into main Jun 5, 2025
60 checks passed
@marcusschiesser marcusschiesser deleted the tp/fix-conflict-package-versions-in-ts-examples branch June 5, 2025 02:25