Update getting_started.md #432
Updating llama-stack version
Walkthrough
Updates docs/getting_started.md to bump the llama-stack install version from 0.2.16 to 0.2.17 and fixes the final code block's closing fence and trailing newline.
Estimated code review effort: 🎯 1 (Trivial) | ⏱️ ~2 minutes
tisnik
left a comment
LGTM, thank you
Actionable comments posted: 0
Caution
Some comments are outside the diff and can’t be posted inline due to platform limitations.
⚠️ Outside diff range comments (1)
docs/getting_started.md (1)
12-15: Confirm Python version compatibility before listing 3.13
Based on a quick PyPI check (see your script output), torch 2.8.0, faiss-cpu 1.9.0.post1, and triton 3.4.0 all publish cp313 wheels, whereas peft, datasets, litellm, mcp, and opentelemetry-sdk do not yet provide Python 3.13 wheels. To avoid early-release pitfalls, please update docs/getting_started.md as follows:
• File: docs/getting_started.md, lines 12–15
• Change the Python line to explicitly recommend 3.12 and call out 3.13 support status. For example:

```diff
- 1. Python 3.12 or 3.13
+ 1. Python 3.12 (recommended for full compatibility)
```

• Add a note below, e.g.:

> Note: Python 3.13 wheels are currently available for torch, faiss-cpu, and triton only; other dependencies may require source builds.

Let me know if you'd like help drafting a full compatibility matrix.
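The PyPI wheel check referenced above can be reproduced with curl and jq (both of which the guide already installs). This is a rough sketch, not the reviewer's actual script; `check_cp313` is a hypothetical helper name:

```shell
# Hypothetical helper: count cp313 wheels in a package's latest PyPI release.
check_cp313() {
  curl -s "https://pypi.org/pypi/$1/json" \
    | jq -r --arg p "$1" \
        '"\($p): \([.urls[].filename | select(test("cp313"))] | length) cp313 wheel(s)"'
}

check_cp313 torch   # nonzero count: 3.13 wheels are published
check_cp313 peft    # zero count would mean a source build on Python 3.13
```

A count of 0 for any dependency is the signal to keep the doc on Python 3.12.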
🧹 Nitpick comments (6)
docs/getting_started.md (6)
18-20: Prefer the official uv installer or pipx to avoid PATH issues; add cross-platform package notes.
`pip install --user uv` often leaves the `uv` binary off PATH on many systems. Recommend the upstream installer and alternatives, and broaden OS guidance for curl/jq.

Apply:

```diff
-1. `pip install --user uv`
-1. `sudo dnf install curl jq`
+1. Install uv (recommended):
+   - Linux/macOS: `curl -LsSf https://astral.sh/uv/install.sh | sh`
+   - Or with pipx: `pipx install uv`
+1. Install curl and jq:
+   - Fedora/RHEL: `sudo dnf install curl jq`
+   - Debian/Ubuntu: `sudo apt-get update && sudo apt-get install -y curl jq`
+   - macOS (Homebrew): `brew install curl jq`
```
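Whichever install route the doc ends up recommending, readers can confirm `uv` actually landed on PATH with a generic check like this (a sketch, not part of the suggested diff):

```shell
# Check that the uv binary is reachable from the current shell.
if command -v uv >/dev/null 2>&1; then
  uv --version
else
  # pip --user and the astral installer both default to ~/.local/bin
  echo "uv not found on PATH; check ~/.local/bin and your shell profile" >&2
fi
```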
23-25: Make the "Clone LCS repository" step actionable (clone + cd).
As written, it's easy to miss changing directories before running `uv add`.

Apply:

````diff
-1. Clone LCS repository
-1. Add and install all required dependencies
+1. Clone LCS repository and enter it
+   ```bash
+   git clone https://github.com/lightspeed-core/lightspeed-stack.git
+   cd lightspeed-stack
+   ```
+1. Add and install all required dependencies
````
26-45: Lock and sync for reproducibility after adding dependencies.
Without a lock step, fresh environments may resolve different versions than your example output.

Apply:

````diff
     "trl>=0.18.2"
     ```
+1. Resolve and install the lockfile for reproducibility
+   ```bash
+   uv lock && uv sync --frozen
+   ```
````
47-112: The long "Installed packages" transcript is highly volatile; trim or annotate to prevent drift.
This list will rot quickly and confuse readers when counts/versions differ by platform/time.

Apply:

````diff
-   ```text
-   Resolved 195 packages in 1.19s
-   Built lightspeed-stack @ file:///tmp/ramdisk/lightspeed-stack
-   Prepared 12 packages in 1.72s
-   Installed 60 packages in 4.47s
-    + accelerate==1.9.0
-    + autoevals==0.0.129
-    + blobfile==3.0.0
-   ...
-    + trl==0.20.0
-    + wrapt==1.17.2
-    + xxhash==3.5.0
-   ```
+   ```text
+   Installed packages (abbreviated):
+    + llama-stack==0.2.17
+    + lightspeed-stack==<local path>
+    + torch==2.7.1
+    + fastapi==…
+    + opentelemetry-sdk==…
+   … (output varies by platform and time)
+   ```
````
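If the doc keeps a transcript at all, an abbreviated one can be regenerated on demand by filtering the environment for just the pins readers care about; `uv pip list` is assumed to be available here (a sketch, not part of the suggested diff):

```shell
# Show only the key pins instead of the full 60-package transcript.
uv pip list 2>/dev/null | grep -E '^(llama-stack|lightspeed-stack|torch) '
```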
229-235: Clarify server.port=8321 in run.yaml when using "library mode."
The intro claims "only one port (e.g., 8080) will be accessible," but this run.yaml also sets a Llama Stack server port (8321). In library mode, that server block is typically unused; consider commenting this or adding a note to avoid confusion.

Proposed tweak:

```diff
 server:
-  auth: null
-  host: null
-  port: 8321
+  # In library mode this section is not used; LCS serves on 8080.
+  auth: null
+  host: null
+  port: 8321
```
26-45: Optional: consider pinning a few historically fragile deps.
Torch, Triton, FAISS, and TRL often have platform/ABI constraints. Pinning to known-good versions in pyproject and relying on `uv sync --frozen` will reduce "works on my machine" drift.

I can open a follow-up PR to move these from doc commands into pyproject dependencies with explicit pins and platform markers.
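As a sketch of what such pins with platform markers could look like in pyproject.toml (the version numbers and markers below are illustrative, not verified against the project):

```toml
dependencies = [
    # Hypothetical known-good pins; validate real versions before adopting
    "torch==2.7.1",
    "faiss-cpu==1.9.0.post1",
    "triton==3.3.1; sys_platform == 'linux' and platform_machine == 'x86_64'",
]
```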
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
💡 Knowledge Base configuration:
- MCP integration is disabled by default for public repositories
- Jira integration is disabled by default for public repositories
- Linear integration is disabled by default for public repositories
You can enable these sources in your CodeRabbit configuration.
📒 Files selected for processing (1)
docs/getting_started.md (2 hunks)
🔇 Additional comments (3)
docs/getting_started.md (3)
327-341: Fixed closing code fence, nice catch.
The final JSON example now renders correctly.
274-281: Run target exists in Makefile; no fallback needed
The Makefile already defines a `run` target (see line 8: `run: ## Run the service locally`), so the additional fallback command is unnecessary.
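Readers can confirm what the `run` target does without starting the service by using make's dry-run mode (standard make behavior, nothing project-specific assumed):

```shell
# Print the commands `make run` would execute, without running them.
make -n run
```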
26-29: llama-stack 0.2.17 confirmed and repo-wide pins updated
Version 0.2.17 exists on PyPI, and there are no remaining references to 0.2.16. All llama-stack pins (and llama-stack-client) in the project have been bumped to 0.2.17, so this change is ready to merge.

• pyproject.toml:
  – "llama-stack==0.2.17"
  – "llama-stack-client==0.2.17"
• examples/pyproject.llamastack.toml:
  – "llama-stack==0.2.17"
• docs/getting_started.md:
  – uv add "llama-stack==0.2.17"
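The "no remaining references to 0.2.16" claim is easy to re-verify locally with a repo-wide grep (a generic sketch):

```shell
# Fail loudly if any stale 0.2.16 pin survives in config or docs.
if grep -rn '0\.2\.16' --include='*.toml' --include='*.md' .; then
  echo 'stale 0.2.16 references found' >&2
  exit 1
fi
echo 'no stale 0.2.16 references'
```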
Updating llama-stack version
Description
The getting started doc is out of sync with the project dependencies, which results in errors when installing 0.2.16.
https://docs.google.com/document/d/171PHjne9r0yWvEAV8pBnPJQUt7ouHYF-43eP-eGZgQk/edit?tab=t.0
Type of change
Related Tickets & Documents
Checklist before requesting a review
Testing
Summary by CodeRabbit