
Conversation


@tisnik commented Dec 21, 2025

Description

LCORE-1062: Dependencies update

Type of change

  • Refactor
  • New feature
  • Bug fix
  • CVE fix
  • Optimization
  • Documentation Update
  • Configuration Update
  • Bump-up service version
  • Bump-up dependent library
  • Bump-up library or tool used for development (does not change the final image)
  • CI configuration change
  • Konflux configuration change
  • Unit tests improvement
  • Integration tests improvement
  • End to end tests improvement

Tools used to create PR

Identify any AI code assistants used in this PR (for transparency and review context)

  • Assisted-by: N/A
  • Generated by: N/A

Related Tickets & Documents

  • Related Issue #LCORE-1062

Summary by CodeRabbit

  • Chores
    • Updated core dependencies across all supported architectures to the latest stable versions for improved stability and performance.



coderabbitai bot commented Dec 21, 2025

Walkthrough

This pull request updates dependency versions and corresponding SHA256 hashes in two architecture-specific lock files: requirements.aarch64.txt and requirements.x86_64.txt. Packages including datasets, fastapi, mcp, openai, pythainlp, python-multipart, and trl are bumped to newer versions.

Changes

Cohort / File(s): Architecture-specific lock files (requirements.aarch64.txt, requirements.x86_64.txt)
Change summary: Updated package versions (datasets 4.4.1→4.4.2, fastapi 0.124.4→0.126.0, mcp 1.24.0→1.25.0, openai 2.12.0→2.14.0, pythainlp 5.1.2→5.2.0, python-multipart 0.0.20→0.0.21, trl 0.26.1→0.26.2) with refreshed SHA256 hashes; added dependency group ("via") markers for pythainlp.

Estimated code review effort

🎯 1 (Trivial) | ⏱️ ~5 minutes

  • Verify that version bumps align with intended upgrades for each package
  • Confirm hash values correspond to the correct release versions on PyPI or equivalent registries (a verification sketch follows this list)
  • Check that no conflicting version constraints are introduced across the two architecture files
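
As a rough way to run the hash check above, a short script can compare each pinned digest against what PyPI publishes for that release. The parsing below is a loose approximation of uv's lockfile output and the file name is simply one of this PR's two lockfiles; treat it as a sketch, not part of the project's tooling.

```python
"""Spot-check that pinned versions and hashes in a lockfile exist on PyPI (sketch)."""
import json
import re
import urllib.request
from pathlib import Path

# Packages bumped in this PR.
PACKAGES = {"datasets", "fastapi", "mcp", "openai", "pythainlp", "python-multipart", "trl"}


def pinned_entries(path: str) -> dict[str, tuple[str, set[str]]]:
    """Return {name: (version, {sha256 digests})} for the packages of interest."""
    entries: dict[str, tuple[str, set[str]]] = {}
    current = None
    for line in Path(path).read_text(encoding="utf-8").splitlines():
        m = re.match(r"^([A-Za-z0-9_.-]+)==([^ \\]+)", line)
        if m and m.group(1).lower() in PACKAGES:
            current = m.group(1).lower()
            entries[current] = (m.group(2), set())
        elif current and "--hash=sha256:" in line:
            entries[current][1].add(line.split("--hash=sha256:")[1].strip(" \\"))
        elif line and not line.startswith((" ", "#")):
            current = None  # a new top-level requirement we are not tracking
    return entries


def pypi_digests(name: str, version: str) -> set[str]:
    """Fetch the sha256 digests PyPI publishes for a given release."""
    url = f"https://pypi.org/pypi/{name}/{version}/json"
    with urllib.request.urlopen(url) as resp:
        data = json.load(resp)
    return {f["digests"]["sha256"] for f in data["urls"]}


for name, (version, hashes) in pinned_entries("requirements.x86_64.txt").items():
    unknown = hashes - pypi_digests(name, version)
    status = "OK" if not unknown else f"{len(unknown)} unknown hash(es)"
    print(f"{name}=={version}: {status}")
```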


Pre-merge checks and finishing touches

✅ Passed checks (3 passed)
  • Description Check: ✅ Passed. Check skipped because CodeRabbit’s high-level summary is enabled.
  • Title Check: ✅ Passed. The title 'LCORE-1062: Dependencies update' directly corresponds to the main change in the PR, which updates multiple package versions across both aarch64 and x86_64 requirement files.
  • Docstring Coverage: ✅ Passed. No functions were found in the changed files to evaluate docstring coverage, so the check was skipped.

📜 Recent review details

Configuration used: Path: .coderabbit.yaml

Review profile: CHILL

Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between 96888ca and fa52d06.

⛔ Files ignored due to path filters (1)
  • uv.lock is excluded by !**/*.lock
📒 Files selected for processing (2)
  • requirements.aarch64.txt (10 hunks)
  • requirements.x86_64.txt (10 hunks)
🧰 Additional context used
🧠 Learnings (2)
📚 Learning: 2025-08-18T10:57:39.266Z
Learnt from: matysek
Repo: lightspeed-core/lightspeed-stack PR: 292
File: pyproject.toml:59-59
Timestamp: 2025-08-18T10:57:39.266Z
Learning: In the lightspeed-stack project, transitive dependencies like faiss-cpu are intentionally pinned as top-level dependencies to maintain better control over the dependency graph and avoid version conflicts when bundling ML/LLM tooling packages.

Applied to files:

  • requirements.x86_64.txt
  • requirements.aarch64.txt
📚 Learning: 2025-08-18T10:58:14.951Z
Learnt from: matysek
Repo: lightspeed-core/lightspeed-stack PR: 292
File: pyproject.toml:47-47
Timestamp: 2025-08-18T10:58:14.951Z
Learning: psycopg2-binary is required by some llama-stack providers in the lightspeed-stack project, so it cannot be replaced with psycopg v3 or moved to optional dependencies without breaking llama-stack functionality.

Applied to files:

  • requirements.x86_64.txt
  • requirements.aarch64.txt
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (8)
  • GitHub Check: Konflux kflux-prd-rh02 / lightspeed-stack-on-pull-request
  • GitHub Check: build-pr
  • GitHub Check: E2E: server mode / vertexai
  • GitHub Check: E2E: server mode / azure
  • GitHub Check: E2E: library mode / vertexai
  • GitHub Check: E2E: library mode / azure
  • GitHub Check: E2E: library mode / ci
  • GitHub Check: E2E: server mode / ci
🔇 Additional comments (11)
requirements.x86_64.txt (1)

608-610: Minor version bumps and pythainlp “via” annotations look consistent with uv output

datasets, fastapi, mcp, openai, pythainlp, python-multipart, and trl each receive a small version bump, and each comes with a full hash set. The added # pythainlp “via” lines under numpy, pandas, and pydantic-core match its transitive usage and keep the lockfile commentary consistent.

No issues with the lockfile formatting or pinning from what’s shown here; the changes look like a clean regeneration.

Please double‑check:

  • That requirements.aarch64.txt has the same versions for these libs.
  • That CI / integration tests covering FastAPI endpoints, MCP flows, and OpenAI/TRL usage pass with these new versions.

If you want, I can suggest targeted smoke tests for the key paths (FastAPI startup, an OpenAI call, a TRL training step, and basic pythainlp usage) to run locally; a quick import/version sanity check is sketched below.

Also applies to: 659-661, 1700-1702, 1956-1956, 1966-1968, 2099-2099, 2771-2773, 2789-2791, 2881-2881, 3538-3540
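
Along those lines, a quick local check that the bumped packages import and report the expected versions might look like this; the expected versions are copied from this PR's lockfiles, and the script does not exercise any runtime behavior:

```python
"""Import/version sanity check for the packages bumped in this PR (sketch only)."""
from importlib import import_module
from importlib.metadata import version

# Expected versions taken from requirements.*.txt in this PR.
EXPECTED = {
    "datasets": "4.4.2",
    "fastapi": "0.126.0",
    "mcp": "1.25.0",
    "openai": "2.14.0",
    "pythainlp": "5.2.0",
    "python-multipart": "0.0.21",  # distribution name; imports as python_multipart
    "trl": "0.26.2",
}

for dist, expected in EXPECTED.items():
    installed = version(dist)
    import_module(dist.replace("-", "_"))  # fails loudly if the package cannot import
    status = "OK" if installed == expected else f"MISMATCH (installed {installed})"
    print(f"{dist}=={expected}: {status}")
```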

requirements.aarch64.txt (10)

608-610: datasets bump looks consistent with uv lockfile generation

datasets is bumped to 4.4.2 with matching hashes; this aligns with an automated uv pip compile refresh.

Please double‑check:

  • pyproject.toml pins (or ranges) for datasets are compatible with 4.4.2.
  • The requirements.x86_64.txt lockfile was regenerated with the same uv command so both architectures stay in sync (a quick cross-file comparison is sketched below).
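
One way to run that comparison is to diff only the name==version pins from the two files; the sketch below ignores hashes and "# via" comments entirely:

```python
"""Compare pinned versions between the two architecture lockfiles (sketch)."""
import re
from pathlib import Path


def pins(path: str) -> dict[str, str]:
    """Return {package: version} for every top-level pin in a lockfile."""
    result: dict[str, str] = {}
    for line in Path(path).read_text(encoding="utf-8").splitlines():
        m = re.match(r"^([A-Za-z0-9_.-]+)==([^ \\]+)", line)
        if m:
            result[m.group(1).lower()] = m.group(2)
    return result


aarch64 = pins("requirements.aarch64.txt")
x86_64 = pins("requirements.x86_64.txt")

for name in sorted(aarch64.keys() | x86_64.keys()):
    if aarch64.get(name) != x86_64.get(name):
        print(f"{name}: aarch64={aarch64.get(name)} x86_64={x86_64.get(name)}")
print("done; any lines above indicate a divergence between the two lockfiles")
```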

659-661: fastapi bump – verify runtime compatibility with Starlette/Uvicorn stack

fastapi is bumped to 0.126.0 with new hashes; looks like a clean uv‑driven upgrade.

Given how sensitive FastAPI/Starlette/Uvicorn integration can be, please:

  • Confirm starlette and uvicorn versions remain within FastAPI’s supported range.
  • Run at least the API‑level smoke tests (or the main service test suite) to catch any subtle behavior changes (e.g., routing, response models, lifespan events); a minimal example follows.
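
A minimal API-level smoke test along those lines is sketched below; the route, model, and lifespan hook are placeholders rather than the project's actual endpoints:

```python
"""Minimal FastAPI smoke test (sketch); placeholder routes, not the real service."""
from contextlib import asynccontextmanager

from fastapi import FastAPI
from fastapi.testclient import TestClient
from pydantic import BaseModel


class Echo(BaseModel):
    message: str


@asynccontextmanager
async def lifespan(app: FastAPI):
    # Lifespan events are one of the areas worth re-checking after a FastAPI bump.
    app.state.ready = True
    yield
    app.state.ready = False


app = FastAPI(lifespan=lifespan)


@app.post("/echo", response_model=Echo)
async def echo(payload: Echo) -> Echo:
    return payload


def test_routing_and_response_model() -> None:
    # Using TestClient as a context manager runs the lifespan startup/shutdown.
    with TestClient(app) as client:
        resp = client.post("/echo", json={"message": "hi"})
        assert resp.status_code == 200
        assert resp.json() == {"message": "hi"}
```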

1700-1702: mcp bump – ensure any MCP server/client integration tests still pass

mcp is bumped to 1.25.0 with updated hashes; change is mechanically correct.

Because MCP defines protocol‑level behavior:

  • Re‑run any MCP server/client integration or contract tests (a rough client-side sketch follows).
  • Confirm any pinned MCP-related configs or feature flags don’t rely on deprecated behavior from 1.24.x.
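
As a rough contract check, something like the sketch below (assuming the official mcp Python SDK's stdio client; the server command is a placeholder) can confirm the server still initializes and lists its tools as before:

```python
"""Rough MCP client contract check (sketch); the server command is a placeholder."""
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

# Placeholder: point this at whichever MCP server the service actually uses.
SERVER = StdioServerParameters(command="python", args=["-m", "your_mcp_server"])


async def main() -> None:
    async with stdio_client(SERVER) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await session.list_tools()
            # After the 1.24 -> 1.25 bump, confirm the tool inventory is unchanged.
            print([tool.name for tool in tools.tools])


if __name__ == "__main__":
    asyncio.run(main())
```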

1956-1956: New numpy → pythainlp “via” entry documents transitive usage

Adding # pythainlp under numpy’s “via” section just reflects the refreshed dependency graph and improves traceability.


1966-1968: openai bump – confirm client usage and retry logic stay compatible

openai is bumped to 2.14.0 with correct hash updates; looks like a standard lock refresh.

Given possible API/typing changes between minor versions:

  • Verify pyproject.toml (and any internal SDK wrappers) expect 2.14.x.
  • Run a small subset of flows that exercise OpenAI calls (including streaming and multi‑tool flows if applicable) to ensure no regressions; a short smoke-call sketch follows.
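
A short smoke-call sketch covering both a plain and a streaming completion is below; the model name and the OPENAI_API_KEY environment setup are assumptions, not project configuration:

```python
"""OpenAI client smoke calls (sketch); model name and credentials are assumptions."""
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# Plain call: checks that request/response models still round-trip.
resp = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder model
    messages=[{"role": "user", "content": "ping"}],
)
print(resp.choices[0].message.content)

# Streaming call: incremental delta handling is worth re-checking after a bump.
stream = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "ping"}],
    stream=True,
)
for chunk in stream:
    if chunk.choices and chunk.choices[0].delta.content:
        print(chunk.choices[0].delta.content, end="", flush=True)
print()
```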

2099-2099: New pandas → pythainlp “via” entry is metadata-only

The added # pythainlp under the pandas “via” block is consistent with the refreshed graph and is safe.


2771-2773: pythainlp bump – watch for tokenizer/model subtle changes

pythainlp is bumped to 5.2.0 with updated hashes; the change itself is correct.

Because NLP behavior can shift across minor versions:

  • Re‑run any Thai‑language–specific tests or evaluation suites.
  • If you have baselines for segmentation or classification, compare a small sample to ensure quality hasn’t regressed (a minimal baseline-diff sketch follows).
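
A minimal baseline-diff sketch for word segmentation is below; the sample sentences and baseline file are placeholders, and only the default newmm tokenizer is exercised:

```python
"""Compare pythainlp word segmentation against a stored baseline (sketch)."""
import json
from pathlib import Path

from pythainlp.tokenize import word_tokenize

SAMPLES = [
    "สวัสดีครับ",                 # placeholder Thai sample sentences
    "ผมกินข้าวที่ร้านอาหาร",
]
BASELINE = Path("pythainlp_baseline.json")  # hypothetical baseline captured on 5.1.2

current = {text: word_tokenize(text, engine="newmm") for text in SAMPLES}

if BASELINE.exists():
    baseline = json.loads(BASELINE.read_text(encoding="utf-8"))
    for text, tokens in current.items():
        if baseline.get(text) != tokens:
            print(f"CHANGED: {text!r}\n  was {baseline.get(text)}\n  now {tokens}")
else:
    # First run: record current output so the next bump can be diffed against it.
    BASELINE.write_text(json.dumps(current, ensure_ascii=False, indent=2), encoding="utf-8")
    print(f"baseline written to {BASELINE}")
```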

2789-2791: python-multipart bump – verify upload endpoints

python-multipart is bumped to 0.0.21; hashes look consistent with a uv refresh.

Since this underpins file/form uploads for FastAPI/Starlette:

  • Re‑exercise at least one endpoint that handles multipart/form-data (file upload or form submission) to confirm parsing behavior is unchanged; a round-trip sketch follows.
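
A round-trip sketch with a placeholder upload endpoint (not the service's real route) is below; it only confirms that multipart parsing of a file plus a form field still works:

```python
"""Multipart/form-data round-trip smoke test (sketch); placeholder endpoint."""
from fastapi import FastAPI, File, Form, UploadFile
from fastapi.testclient import TestClient

app = FastAPI()


@app.post("/upload")
async def upload(file: UploadFile = File(...), note: str = Form(...)) -> dict:
    data = await file.read()
    return {"filename": file.filename, "size": len(data), "note": note}


def test_multipart_parsing() -> None:
    client = TestClient(app)
    resp = client.post(
        "/upload",
        files={"file": ("hello.txt", b"hello world", "text/plain")},
        data={"note": "smoke test"},
    )
    assert resp.status_code == 200
    assert resp.json() == {"filename": "hello.txt", "size": 11, "note": "smoke test"}
```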

2881-2881: pydantic “via pythainlp” entry is a benign graph clarification

Adding # pythainlp under pydantic’s “via” list is metadata only and consistent with a regenerated lockfile.


3538-3540: trl bump – validate training/inference pipelines

trl is bumped to 0.26.2 with new hashes; mechanically looks good.

Because TRL changes can affect training loops and PEFT integration:

  • Re‑run at least one representative TRL-based training or inference pipeline (even on a small subset) to confirm configs, schedulers, and logging still behave as expected; a lightweight version/config smoke check is sketched below.
  • Ensure the trl version here matches any explicit pin in pyproject.toml.
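
Short of a full pipeline run, a lightweight smoke check is sketched below; it only confirms the pinned version and that the SFT configuration still constructs with basic TrainingArguments-style fields, so a real training/inference check would still be needed:

```python
"""Lightweight TRL smoke check (sketch): version pin plus SFTConfig construction."""
import trl
from trl import SFTConfig, SFTTrainer  # noqa: F401  (import check for the trainer entry point)

# Version expected by this PR's lockfiles.
assert trl.__version__ == "0.26.2", trl.__version__

config = SFTConfig(
    output_dir="/tmp/trl-smoke",        # placeholder values
    per_device_train_batch_size=1,
    max_steps=1,
)
print("trl", trl.__version__, "SFTConfig OK:", config.output_dir)
```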


@tisnik merged commit 1384f97 into lightspeed-core:main Dec 21, 2025
19 of 27 checks passed