LCORE-1026: Updated deps #905
Conversation
Walkthrough
Updated dependency pins across two platform-specific requirement files: FastAPI (0.124.0 → 0.124.2), Matplotlib (3.10.7 → 3.10.8), and SQLAlchemy (unchanged at 2.0.45), with corresponding new SHA256 hashes for package verification.
Estimated code review effort: 🎯 1 (Trivial) | ⏱️ ~5 minutes
Possibly related PRs
Pre-merge checks: ❌ 1 inconclusive, ✅ 2 passed
Actionable comments posted: 0
🧹 Nitpick comments (1)
requirements.x86_64.txt (1)
1643-1699: Matplotlib 3.10.8 — dependency versions verified across architectures. Version pins confirmed:
- numpy==2.2.6, contourpy==1.3.3, kiwisolver==1.4.9, fonttools==4.61.0
- x86_64 and aarch64 have identical pinned versions (parity maintained)
- matplotlib 3.10.8 satisfies pyproject.toml constraint (>=3.10.0)
- All versions are compatible with numpy 2.x ABI (wheel-dependent; ensure binary built against numpy 2.x)
A quick import/render smoke test in CI would be prudent to catch any numpy binary incompatibilities early.
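Such a smoke test can be a single short script in the CI job. A minimal sketch (the Agg backend choice and the output filename are assumptions for illustration, not part of this PR):

```python
# Headless import/render smoke test: catches wheels built against an
# incompatible numpy ABI at CI time rather than in production.
import numpy as np
import matplotlib

matplotlib.use("Agg")  # non-interactive backend, safe on headless CI runners
import matplotlib.pyplot as plt

x = np.linspace(0, 6, 50)
fig, ax = plt.subplots()
ax.plot(x, np.sin(x))          # exercises the numpy -> matplotlib data path
fig.savefig("mpl_smoke.png")   # exercises the rendering/raster backend
print("matplotlib", matplotlib.__version__, "render OK")
```

If a wheel's compiled extensions were built against an incompatible NumPy, the import or the plot call typically raises immediately, which is exactly what the early check is for.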
📜 Review details
Configuration used: Path: .coderabbit.yaml
Review profile: CHILL
Plan: Pro
⛔ Files ignored due to path filters (1)
uv.lock is excluded by !**/*.lock
📒 Files selected for processing (2)
requirements.aarch64.txt (4 hunks)
requirements.x86_64.txt (4 hunks)
🧰 Additional context used
🧠 Learnings (3)
📚 Learning: 2025-08-18T10:57:39.266Z
Learnt from: matysek
Repo: lightspeed-core/lightspeed-stack PR: 292
File: pyproject.toml:59-59
Timestamp: 2025-08-18T10:57:39.266Z
Learning: In the lightspeed-stack project, transitive dependencies like faiss-cpu are intentionally pinned as top-level dependencies to maintain better control over the dependency graph and avoid version conflicts when bundling ML/LLM tooling packages.
Applied to files:
requirements.x86_64.txt, requirements.aarch64.txt
📚 Learning: 2025-08-18T10:58:14.951Z
Learnt from: matysek
Repo: lightspeed-core/lightspeed-stack PR: 292
File: pyproject.toml:47-47
Timestamp: 2025-08-18T10:58:14.951Z
Learning: psycopg2-binary is required by some llama-stack providers in the lightspeed-stack project, so it cannot be replaced with psycopg v3 or moved to optional dependencies without breaking llama-stack functionality.
Applied to files:
requirements.x86_64.txt, requirements.aarch64.txt
📚 Learning: 2025-08-18T10:56:55.349Z
Learnt from: matysek
Repo: lightspeed-core/lightspeed-stack PR: 292
File: pyproject.toml:0-0
Timestamp: 2025-08-18T10:56:55.349Z
Learning: The lightspeed-stack project intentionally uses a "generic image" approach, bundling many dependencies directly in the base runtime image to work for everyone, rather than using lean base images with optional dependency groups.
Applied to files:
requirements.x86_64.txt, requirements.aarch64.txt
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (6)
- GitHub Check: Konflux kflux-prd-rh02 / lightspeed-stack-on-pull-request
- GitHub Check: build-pr
- GitHub Check: E2E: library mode / ci
- GitHub Check: E2E: server mode / ci
- GitHub Check: E2E: library mode / azure
- GitHub Check: E2E: server mode / azure
🔇 Additional comments (5)
requirements.x86_64.txt (2)
3326-3326: SQLAlchemy 2.0.45 — additional hashes only. No version change; new wheels/hashes added. Both architecture files have been updated identically with 65 hash entries, improving reproducibility. Consider verifying that the hash sets are complete and current for your deployment needs.
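For context, pip's hash-checking mode (`pip install --require-hashes`) compares each downloaded artifact's SHA256 digest against the pinned set and accepts the artifact if it matches any of them. A sketch of the equivalent verification (the file path and digest set are placeholders, not hashes from this PR):

```python
import hashlib

def sha256_of(path: str) -> str:
    """Stream a file and return its hex SHA256 digest, as pip does for downloaded artifacts."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 16), b""):
            h.update(chunk)
    return h.hexdigest()

def verify(path: str, allowed_digests: set[str]) -> bool:
    # An artifact is accepted if its digest matches ANY hash pinned for its package.
    return sha256_of(path) in allowed_digests
```

Since each published wheel and sdist has its own digest, publishing new wheels for an existing release grows the pinned hash set without changing the version, which is what this diff shows.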
659-661: FastAPI 0.124.2 pin is compatible with Starlette 0.50.0 and Pydantic 2.12.5 — approved. The version pinning is correct. FastAPI 0.124.2 requires Starlette >=0.40.0,<0.51.0 (0.50.0 is within range) and Pydantic v2.x (2.12.5 is compatible). Cross-architecture consistency verified (x86_64 and aarch64 both pin fastapi==0.124.2). As noted in learnings, faiss-cpu and psycopg2-binary remain top-level pinned as intended.
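The compatibility claim reduces to interval checks on version numbers. A minimal sketch, simplified to plain X.Y.Z versions (real specifier matching follows the full PEP 440 rules, including pre-releases and exclusions):

```python
def vtuple(version: str) -> tuple[int, ...]:
    """Parse a plain X.Y.Z version string into a comparable tuple."""
    return tuple(int(part) for part in version.split("."))

def satisfies(version: str, lower: str, upper: str) -> bool:
    # Models a ">=lower,<upper" specifier: lower bound inclusive, upper exclusive.
    return vtuple(lower) <= vtuple(version) < vtuple(upper)

# FastAPI 0.124.2 declares starlette >=0.40.0,<0.51.0
print(satisfies("0.50.0", "0.40.0", "0.51.0"))  # True: pinned starlette is in range
print(satisfies("0.51.0", "0.40.0", "0.51.0"))  # False: upper bound is exclusive
```

The exclusive upper bound is why starlette==0.50.0 passes while a hypothetical 0.51.0 would not; production tooling should use the `packaging` library's SpecifierSet rather than this simplification.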
requirements.aarch64.txt (3)
659-661: FastAPI 0.124.2 is compatible with pinned Starlette and Pydantic versions. FastAPI 0.124.2 requires starlette<0.51.0,>=0.40.0 and pydantic!=1.8,!=1.8.1,!=2.0.0,!=2.0.1,!=2.1.0,<3.0.0,>=1.7.4. The pinned versions starlette==0.50.0 and pydantic==2.12.5 satisfy both constraints and have no known compatibility issues on Python 3.12.
3326-3326: SQLAlchemy version and hashes verified across architectures. SQLAlchemy remains at 2.0.45 with consistent hashes across both aarch64 and x86_64 requirement files. greenlet==3.3.0 is present in both files, confirming compatibility. Lines 3326, 3333, 3352, and 3354 contain additional SHA256 hash entries for the same SQLAlchemy package, which is standard pip requirements format. No action needed.
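Multiple --hash lines for one pin are normal: pip records one digest per published artifact (each platform wheel plus the sdist), joined by line continuations. The layout looks like this (digests elided with placeholders, shown only to illustrate the format):

```
sqlalchemy==2.0.45 \
    --hash=sha256:<digest-of-wheel-1> \
    --hash=sha256:<digest-of-wheel-2> \
    --hash=sha256:<digest-of-sdist>
```

During installation, the artifact pip actually downloads only has to match one of the listed digests.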
1643-1698: Matplotlib 3.10.8: aarch64 wheels and NumPy/stack compatibility confirmed. The cp312 aarch64 manylinux wheel exists (matplotlib-3.10.8-cp312-cp312-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl). All pinned dependencies satisfy 3.10.8's requirements: numpy>=1.23 (pin: 2.2.6 ✓), kiwisolver>=1.3.1 (pin: 1.4.9 ✓), contourpy>=1.0.1 (pin: 1.3.3 ✓), pillow>=8 (pin: 12.0.0 ✓).
Description
LCORE-1026: Updated deps
Summary by CodeRabbit
Release Notes