LCORE-1026: Konflux requirements updates #871
Conversation
Walkthrough

Updated multiple Python package dependencies and their cryptographic hashes in both architecture-specific requirements files. Packages updated include faiss-cpu, fastapi, greenlet, litellm, openai, OpenTelemetry components (bumped to 1.39.0), and protobuf-related packages. No functional code or logic changes.
Estimated code review effort: 🎯 2 (Simple) | ⏱️ ~10 minutes
Possibly related PRs
Pre-merge checks and finishing touches: ✅ Passed checks (3 passed)
Actionable comments posted: 3
Caution
Some comments are outside the diff and can’t be posted inline due to platform limitations.
⚠️ Outside diff range comments (1)
requirements.aarch64.txt (1)
1977-2030: Critical: opentelemetry-semantic-conventions version mismatchThe review claims OpenTelemetry versions are aligned, but this is incorrect. According to OpenTelemetry's official v1.39.0 release notes,
opentelemetry-semantic-conventionsshould be v1.38.0 (or later compatible version), not 0.60b0. The 0.60b0 designation is reserved for theopentelemetry-instrumentationcontrib package; semantic-conventions follows independent versioning and was bumped to 1.38.0 in the 1.39.0 release cycle. The mismatch between 1.39.0 SDK and 0.60b0 semantic-conventions may cause compatibility issues. Verify the actual semantic-conventions version in the file and update if necessary.
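A verification like the one requested above can be scripted. The sketch below is a hypothetical helper (not part of the repository) that pulls OpenTelemetry pins out of requirements text so the SDK-line (1.x) and contrib-line (0.xxbN) versions can be compared side by side; the sample pins are illustrative, not copied from the actual lock files.

```python
import re

def otel_pins(requirements_text):
    """Map each pinned opentelemetry-* package to its version string."""
    pins = {}
    for match in re.finditer(r"^(opentelemetry[\w.-]*)==(\S+)",
                             requirements_text, re.MULTILINE):
        pins[match.group(1)] = match.group(2)
    return pins

# Illustrative sample, not copied from the actual lock files.
sample = """\
opentelemetry-api==1.39.0
opentelemetry-sdk==1.39.0
opentelemetry-semantic-conventions==0.60b0
"""
print(otel_pins(sample))
```

Any entry whose version does not start with the expected major line stands out immediately in the returned mapping.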
📜 Review details
Configuration used: Path: .coderabbit.yaml
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (2)
requirements.aarch64.txt (6 hunks), requirements.x86_64.txt (6 hunks)
🧰 Additional context used
🧠 Learnings (3)
📓 Common learnings
Learnt from: matysek
Repo: lightspeed-core/lightspeed-stack PR: 292
File: pyproject.toml:59-59
Timestamp: 2025-08-18T10:57:39.266Z
Learning: In the lightspeed-stack project, transitive dependencies like faiss-cpu are intentionally pinned as top-level dependencies to maintain better control over the dependency graph and avoid version conflicts when bundling ML/LLM tooling packages.
📚 Learning: 2025-08-18T10:57:39.266Z
Learnt from: matysek
Repo: lightspeed-core/lightspeed-stack PR: 292
File: pyproject.toml:59-59
Timestamp: 2025-08-18T10:57:39.266Z
Learning: In the lightspeed-stack project, transitive dependencies like faiss-cpu are intentionally pinned as top-level dependencies to maintain better control over the dependency graph and avoid version conflicts when bundling ML/LLM tooling packages.
Applied to files:
requirements.aarch64.txt, requirements.x86_64.txt
📚 Learning: 2025-08-18T10:58:14.951Z
Learnt from: matysek
Repo: lightspeed-core/lightspeed-stack PR: 292
File: pyproject.toml:47-47
Timestamp: 2025-08-18T10:58:14.951Z
Learning: psycopg2-binary is required by some llama-stack providers in the lightspeed-stack project, so it cannot be replaced with psycopg v3 or moved to optional dependencies without breaking llama-stack functionality.
Applied to files:
requirements.aarch64.txt, requirements.x86_64.txt
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (1)
- GitHub Check: build-pr
🔇 Additional comments (12)
requirements.aarch64.txt (6)
1-3: Ensure requirements.x86_64.txt mirrors version bumps in requirements.aarch64.txt and verify deterministic hash resolution.

Cross-arch parity is critical for reproducible builds. Verify the following packages match across both architecture-specific requirements files: faiss-cpu, fastapi, greenlet, litellm, mcp, openai, protobuf, opentelemetry-api, opentelemetry-sdk, and opentelemetry-exporter-otlp. Additionally, confirm Konflux image builds continue to resolve wheels deterministically with hashes.
Also applies to: 664-667, 1706-1709, 1977-2030, 3583-3596
1969-1971: OpenAI 2.9.0: audit for old API usage.

Ensure code uses the v2 client patterns (OpenAI().chat.completions.create, etc.). Search for legacy patterns like openai.ChatCompletion.create(), which would break with this version.
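A minimal sketch of such an audit, assuming a simple line-by-line regex scan is acceptable (the pattern list is illustrative, not exhaustive):

```python
import re

# Legacy OpenAI v0.x call sites that break under the 2.x client.
LEGACY_PATTERNS = [
    r"openai\.ChatCompletion\.create",
    r"openai\.Completion\.create",
    r"openai\.api_key\s*=",
]

def find_legacy_openai_usage(source):
    """Return (line number, line) pairs that match a legacy pattern."""
    hits = []
    for lineno, line in enumerate(source.splitlines(), start=1):
        if any(re.search(p, line) for p in LEGACY_PATTERNS):
            hits.append((lineno, line.strip()))
    return hits

old_style = 'resp = openai.ChatCompletion.create(model="gpt-4", messages=[])'
new_style = "client = OpenAI()\nresp = client.chat.completions.create(model='gpt-4', messages=[])"
print(find_legacy_openai_usage(old_style))  # one hit
print(find_legacy_openai_usage(new_style))  # no hits
```

In practice the same scan is usually done with `rg` or `grep` over the source tree; the helper above just makes the intent explicit.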
2401-2410: protobuf 6.33.2: watch for libs pinning <6.

Scan the repo for restrictive protobuf constraints or generated stubs that assume an older runtime. Unable to verify this concern due to repository access limitations.
969-1017: The greenlet 3.3.0 wheels for Linux aarch64 are available on PyPI (manylinux_2_24_aarch64, manylinux_2_28_aarch64, and musllinux_1_2_aarch64). No action needed.

Likely an incorrect or invalid review comment.
1398-1401: LiteLLM 1.80.8 is compatible with OpenAI 2.9.0.

LiteLLM declares openai>=1.0.0 as a requirement, and v2.9.0 contains only minor features and bugfixes with no breaking changes. No smoke testing is required for this version bump.
1706-1709: mcp version 1.23.1 does not exist on PyPI; use 1.23.0 instead.

Web search confirms that the latest available mcp release on PyPI is 1.23.0 (uploaded December 2, 2025). Version 1.23.1 does not appear in the PyPI release history. Update the pin to mcp==1.23.0 and verify the hash corresponds to the actual PyPI package.

Likely an incorrect or invalid review comment.
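The existence check behind this comment can be reproduced against the PyPI JSON API (e.g. `https://pypi.org/pypi/mcp/json`); the sketch below operates on the `releases` mapping that API returns, with illustrative sample data standing in for a live response.

```python
def version_published(releases, version):
    """True if the version exists and has at least one uploaded file."""
    return bool(releases.get(version))

# Illustrative stand-in for the "releases" object of a PyPI JSON response.
sample_releases = {
    "1.22.0": ["mcp-1.22.0-py3-none-any.whl"],
    "1.23.0": ["mcp-1.23.0-py3-none-any.whl"],
}
print(version_published(sample_releases, "1.23.0"))  # True
print(version_published(sample_releases, "1.23.1"))  # False
```

Running the same check during lockfile generation would catch a non-existent pin before it reaches CI.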
requirements.x86_64.txt (6)
647-663: faiss-cpu 1.13.1: OK; keep it pinned top-level.

This aligns with project practice to pin transitive ML deps for stability. Based on learnings, faiss-cpu is intentionally top-level pinned.
Confirm the Konflux base image can import the wheel. Build stage: run python -c "import faiss,platform;print(faiss.__version__, platform.platform())".
1707-1708: mcp 1.23.1: LGTM. No compatibility flags from this lockfile; proceed.
1970-1971: openai 2.9.0: Verify no legacy v0.x usage patterns remain in codebase.

The requirements.x86_64.txt pins openai==2.9.0, but code review must confirm Python source files use the 2.x client (e.g., from openai import OpenAI) and do not contain legacy patterns like openai.ChatCompletion.create or openai.Completion.create.
970-1018: Verify greenlet 3.3.0 compatibility with SQLAlchemy 2.0.44 if the async engine is used.

If the codebase uses create_async_engine, review greenlet's async context handling to ensure compatibility under concurrent load.
1399-1401: litellm 1.80.8: minor observation; validate provider calls in CI.The 1.80.8 release adds new providers (Z.AI, RAGFlow, PublicAI, Google Cloud Chirp3 HD) and expands the Responses API to support additional providers. No breaking changes are documented in the release notes. Ensuring existing provider calls are tested in CI remains a sound practice when updating this dependency.
1977-1978: Verify telemetry stack after OTEL + semantic-conventions upgrade.

These versions should be tested together (especially if semantic-conventions is a beta release), but confirmation requires checking actual versions in requirements.x86_64.txt and requirements.aarch64.txt for cross-arch parity, as well as verifying whether the application actually initializes or uses OpenTelemetry tracing.
faiss-cpu==1.13.1 \
    --hash=sha256:0b2f0e6cd30511b9fe320a2309389269269d3e363cc88c3a0380095a8c08ae27 \
    --hash=sha256:0fece5b63e8d014f8db4abfe0b4c9a82e6508e64f450fce700e5cb4b47041f1a \
    --hash=sha256:168986e3f152a7568257c5ac50f3cf1a1aaa34fb41e1ba7259799bcb8ffe687f \
    --hash=sha256:24cb2d6ce2459c94e15a6cecfed15ff8d9f997aed7bae4037c0f045022030cb3 \
    --hash=sha256:2967def7aa2da8efbf6a5da81138780aa17a9970ca666417cb632a00a593423d \
    --hash=sha256:30c179891656a988f5223e586c696432aacc5f4e763d85e165be30ef57ac2bbf \
    --hash=sha256:30e9c129e0beb9da699982d9068322e3808e0d0faab4652cabf2c08900ab7892 \
    --hash=sha256:3de25edb0e69c1b95eeda923b2e23da01f472b2cc3f4817e63b25a56847d6ea7 \
    --hash=sha256:5195ab9149c563cafe4da8ab4cc0b84b177cbb1f8aa897a8c199e11ef4f37e16 \
    --hash=sha256:5f71c8840794c39c1e1cdd92c2ef4d3f77b3e650f614f296e31c2545ad2bab51 \
    --hash=sha256:8ad542573ad05af6c508f4cf5268ba2aad06f0c8d4e780a0eeba7fe6fd274922 \
    --hash=sha256:9860949a70d1b25ff11ac9600aeda796db6710bf667b1f5508b44a63e6170b30 \
    --hash=sha256:dbcf54daf14068b23c3d4116aa087eff6cf7cb43457c572318dfc2cbd944607a \
    --hash=sha256:ff5bdbf392081659e6b0f98f03b602bf08d1b5a790e28aa1185ae925decff6b2 \
    --hash=sha256:ffc58173e24026ee4dc08c50dd3506ad553d4b2103892500b0d4ae9344027d57
🧩 Analysis chain
🏁 Script executed:

#!/bin/bash
# Check PyPI for faiss-cpu 1.13.1 wheel availability
curl -s https://pypi.org/pypi/faiss-cpu/1.13.1/json | jq -r '.releases["1.13.1"][] | .filename' | head -20

Repository: lightspeed-core/lightspeed-stack
Length of output: 208
🏁 Script executed:

# List all files in repository to understand structure
git ls-files requirements* | head -20

Repository: lightspeed-core/lightspeed-stack
Length of output: 208
🏁 Script executed:

# Check if there's an x86_64 requirements file for comparison
cat requirements.x86_64.txt 2>/dev/null | grep -A 15 "faiss-cpu==" | head -20

Repository: lightspeed-core/lightspeed-stack
Length of output: 208
🌐 Web query:
faiss-cpu 1.13.1 PyPI wheels aarch64 python3.12
💡 Result:
Short answer: There is no faiss‑cpu 1.13.1 release on PyPI (no 1.13.1 wheels). The published faiss‑cpu wheels on PyPI are for 1.13.0 and include manylinux aarch64 wheels (a cp39‑abi3 manylinux_2_27_aarch64 / manylinux_2_28_aarch64 wheel) which are intended to be Python‑3.9+ ABI3 compatible and therefore should work on Python 3.12 on aarch64. [1][2]
Sources:
- faiss‑cpu files on PyPI (shows 1.13.0 wheels including manylinux aarch64 cp39‑abi3). [1]
- faiss‑wheels (GitHub) repository / packaging details. [2]
faiss-cpu 1.13.1 does not exist on PyPI; use 1.13.0 instead.
faiss-cpu version 1.13.1 is not published on PyPI. Only version 1.13.0 is available, which includes aarch64 wheels (cp39-abi3 manylinux_2_27/2_28_aarch64) compatible with Python 3.12. Update the pin to faiss-cpu==1.13.0 and regenerate hashes accordingly.
🤖 Prompt for AI Agents
In requirements.aarch64.txt around lines 647 to 662, the pinned package
faiss-cpu==1.13.1 does not exist on PyPI; change the pin to faiss-cpu==1.13.0
and regenerate the corresponding pip hash lines (replace the current --hash
entries with the hashes produced for 1.13.0) so the requirements file references
the published aarch64 wheel for Python 3.12.
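Regenerating the hash lines is normally delegated to the lock tool (for example `pip-compile --generate-hashes`), but each `--hash` entry is just the SHA-256 digest of a downloaded artifact, as this sketch shows:

```python
import hashlib

def pip_hash(artifact_bytes: bytes) -> str:
    """Render a pip-style hash line for a downloaded wheel or sdist."""
    return "--hash=sha256:" + hashlib.sha256(artifact_bytes).hexdigest()

# Illustrative input; a real check would hash the downloaded wheel file.
print(pip_hash(b"example wheel bytes"))
```

Comparing this output against the pinned `--hash` lines confirms the lockfile references the artifact actually published on PyPI.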
fastapi==0.124.0 \
    --hash=sha256:260cd178ad75e6d259991f2fd9b0fee924b224850079df576a3ba604ce58f4e6 \
    --hash=sha256:91596bdc6dde303c318f06e8d2bc75eafb341fc793a0c9c92c0bc1db1ac52480
    # via
🧩 Analysis chain
🌐 Web query:
FastAPI 0.124.0 Starlette requirement version
💡 Result:
FastAPI 0.124.0 requires Starlette >=0.40.0,<0.50.0. [1][2]
Sources:
- FastAPI release notes (shows Starlette bump to allow up to <0.50.0). [1]
- Release summary / changelog noting the Starlette range bump. [2]
FastAPI 0.124.0 is incompatible with Starlette 0.50.0.
FastAPI 0.124.0 requires Starlette >=0.40.0,<0.50.0, but the requirements pin Starlette 0.50.0. This version mismatch will cause runtime failures. Downgrade Starlette to a version below 0.50.0 (e.g., 0.49.x).
🤖 Prompt for AI Agents
In requirements.aarch64.txt around lines 664 to 667, FastAPI 0.124.0 is pinned
but Starlette is currently 0.50.0 which is incompatible; change the Starlette
pin to a compatible <0.50.0 release (for example 0.49.x) and update the
corresponding hash entries for that package in the requirements file so the
resolver installs a Starlette version within FastAPI's supported range.
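The constraint check is easy to sanity-test offline. This sketch uses plain tuple comparison, which is sufficient for simple X.Y.Z versions (no pre-release tags); the range below assumes the >=0.40.0,<0.50.0 requirement quoted earlier in the review.

```python
def parse(version):
    """Split 'X.Y.Z' into an integer tuple for ordering comparisons."""
    return tuple(int(part) for part in version.split("."))

def satisfies(candidate, lower="0.40.0", upper="0.50.0"):
    """Check lower <= candidate < upper, mirroring a pip specifier range."""
    return parse(lower) <= parse(candidate) < parse(upper)

print(satisfies("0.49.3"))  # True: an acceptable downgrade target
print(satisfies("0.50.0"))  # False: the currently pinned version
```

For anything beyond plain release versions, `packaging.specifiers.SpecifierSet` is the robust way to evaluate such ranges.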
fastapi==0.124.0 \
    --hash=sha256:260cd178ad75e6d259991f2fd9b0fee924b224850079df576a3ba604ce58f4e6 \
    --hash=sha256:91596bdc6dde303c318f06e8d2bc75eafb341fc793a0c9c92c0bc1db1ac52480
    # via
🧩 Analysis chain
🏁 Script executed:

rg -n '^(fastapi|starlette|pydantic)==' requirements.x86_64.txt

Repository: lightspeed-core/lightspeed-stack
Length of output: 208
🌐 Web query:
FastAPI 0.124.0 starlette compatibility requirements
💡 Result:
FastAPI 0.124.0 requires Starlette but does not leave it unbounded — check the package metadata (pyproject/pyPI) for the exact pinned range for that FastAPI release. See FastAPI 0.124.0 on PyPI and the FastAPI repo/discussions about Starlette pinning for details. [1][2]
References
- fastapi 0.124.0 — PyPI (release metadata). [1]
- FastAPI discussion / history about pinning Starlette versions. [2]
🌐 Web query:
FastAPI 0.124.0 PyPI starlette version requirements exact
💡 Result:
FastAPI 0.124.0 requires Starlette with the constraint:
starlette>=0.40.0,<0.50.0. [1][2]
Sources:
- fastapi pyproject.toml (dependency list). [GitHub]. [1]
- FastAPI 0.124.0 metadata on PyPI. [2]
FastAPI 0.124.0 requires starlette>=0.40.0,<0.50.0 and is therefore not compatible with starlette 0.50.0.
The compatibility claim in the original comment is incorrect. FastAPI 0.124.0 caps starlette below 0.50.0; starlette 0.50.0 will break this upgrade. Verify the actual starlette version specified in requirements.x86_64.txt and confirm whether fastapi 0.124.0 is the intended target or if starlette should be pinned to a 0.49.x release.
🤖 Prompt for AI Agents
In requirements.x86_64.txt around lines 664-667, FastAPI 0.124.0 is declared, but
that release requires starlette>=0.40.0,<0.50.0, so it conflicts with any pinned
starlette 0.50.0. Resolve by choosing one of two fixes: either pin starlette to
a compatible 0.49.x release and update the corresponding --hash entries
(regenerate the lock/hashes), or upgrade FastAPI to a version compatible with
starlette 0.50.x and regenerate hashes. Also search the file for any other
starlette/FastAPI pins and run the dependency resolver (pip-compile or
pip-compile-multi) to produce consistent versions and updated hashes before
committing.
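Cross-arch parity checks like the one suggested above can be scripted. This hypothetical helper (not part of the repository) parses one hash-locked entry, mirroring pip's `--generate-hashes` output format, so pins and hash sets can be diffed between the two requirements files.

```python
import re

def parse_locked_entry(block):
    """Split 'name==version \\ --hash=... \\ --hash=...' into parts."""
    name, _, rest = block.partition("==")
    version = rest.split()[0].rstrip("\\")
    hashes = re.findall(r"--hash=sha256:([0-9a-f]{64})", block)
    return name.strip(), version, hashes

# Illustrative entry in the lockfile's format.
entry = """fastapi==0.124.0 \\
    --hash=sha256:260cd178ad75e6d259991f2fd9b0fee924b224850079df576a3ba604ce58f4e6 \\
    --hash=sha256:91596bdc6dde303c318f06e8d2bc75eafb341fc793a0c9c92c0bc1db1ac52480
"""
name, version, hashes = parse_locked_entry(entry)
print(name, version, len(hashes))
```

Comparing the (name, version) pairs across requirements.aarch64.txt and requirements.x86_64.txt flags any drift, even when the wheel hashes legitimately differ per architecture.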
Description
LCORE-1026: Konflux requirements updates
Type of change
Tools used to create PR
Identify any AI code assistants used in this PR (for transparency and review context)
Related Tickets & Documents