Merged
2 changes: 1 addition & 1 deletion README.md
@@ -245,7 +245,7 @@ version = "0.1.0"
description = "Llama Stack runner"
authors = []
dependencies = [
-"llama-stack==0.2.20",
+"llama-stack==0.2.21",
"fastapi>=0.115.12",
"opentelemetry-sdk>=1.34.0",
"opentelemetry-exporter-otlp>=1.34.0",
2 changes: 1 addition & 1 deletion docs/deployment_guide.md
@@ -390,7 +390,7 @@ cp examples/run.yaml /tmp/llama-stack-server
The output should be in this form:
```json
{
-"version": "0.2.20"
+"version": "0.2.21"
}
```
Comment on lines +393 to 395
Contributor
⚠️ Potential issue | 🟡 Minor

Update remaining version example for consistency

Nice catch updating this sample to 0.2.21. There’s still another curl ... /v1/version output later in the guide (Line 1074) that shows 0.2.18, so readers get conflicting signals. Please bump that snippet to 0.2.21 as well.

🤖 Prompt for AI Agents
In docs/deployment_guide.md around lines 393-395 and again at ~1074, the example
curl /v1/version output is inconsistent (one shows "0.2.21" while the later
snippet still shows "0.2.18"); update the later snippet to match by changing the
version string to "0.2.21" so both examples are consistent; ensure the JSON
formatting and surrounding backticks remain unchanged.
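
The consistency check the reviewer describes can be sketched as a short script. This is a hedged illustration, not project tooling: the response body below is a canned sample, and in practice it would come from an HTTP GET against the running service (the `curl ... /v1/version` call mentioned above).

```python
import json

# Pinned release this PR standardizes on.
EXPECTED = "0.2.21"

# Canned stand-in for the /v1/version response body; in a real check this
# would be fetched from the service, e.g. via curl or urllib.
response_body = '{"version": "0.2.21"}'

# Parse the JSON and compare the reported version to the pinned one.
got = json.loads(response_body)["version"]
if got == EXPECTED:
    print(f"version OK: {got}")
else:
    print(f"version mismatch: got {got}, expected {EXPECTED}")
```

Running this against every version snippet in the docs would have flagged the stale `0.2.18` example at Line 1074.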


4 changes: 2 additions & 2 deletions docs/getting_started.md
@@ -24,7 +24,7 @@ It is possible to run Lightspeed Core Stack service with Llama Stack "embedded"
1. Add and install all required dependencies
```bash
uv add \
-"llama-stack==0.2.20" \
+"llama-stack==0.2.21" \
"fastapi>=0.115.12" \
"opentelemetry-sdk>=1.34.0" \
"opentelemetry-exporter-otlp>=1.34.0" \
@@ -383,4 +383,4 @@ curl -X POST "http://localhost:8080/v1/query" \
```

#### Step 4: Verify connectivity
-After starting the MCP servers and updating `lightspeed-stack.yaml`, test by sending a prompt to the AI agent. LCS evaluates the prompt against available tools’ metadata, selects the appropriate tool, calls the corresponding MCP server, and uses the result to generate more accurate agent response.
\ No newline at end of file
+After starting the MCP servers and updating `lightspeed-stack.yaml`, test by sending a prompt to the AI agent. LCS evaluates the prompt against available tools’ metadata, selects the appropriate tool, calls the corresponding MCP server, and uses the result to generate more accurate agent response.
4 changes: 2 additions & 2 deletions docs/openapi.json
@@ -2153,7 +2153,7 @@
"llama_stack_version"
],
"title": "InfoResponse",
-"description": "Model representing a response to an info request.\n\nAttributes:\n name: Service name.\n service_version: Service version.\n llama_stack_version: Llama Stack version.\n\nExample:\n ```python\n info_response = InfoResponse(\n name=\"Lightspeed Stack\",\n service_version=\"1.0.0\",\n llama_stack_version=\"0.2.20\",\n )\n ```",
+"description": "Model representing a response to an info request.\n\nAttributes:\n name: Service name.\n service_version: Service version.\n llama_stack_version: Llama Stack version.\n\nExample:\n ```python\n info_response = InfoResponse(\n name=\"Lightspeed Stack\",\n service_version=\"1.0.0\",\n llama_stack_version=\"0.2.21\",\n )\n ```",
"examples": [
{
"llama_stack_version": "1.0.0",
@@ -3159,4 +3159,4 @@
}
}
}
-}
\ No newline at end of file
+}
2 changes: 1 addition & 1 deletion docs/openapi.md
@@ -1011,7 +1011,7 @@ Example:
info_response = InfoResponse(
name="Lightspeed Stack",
service_version="1.0.0",
-llama_stack_version="0.2.20",
+llama_stack_version="0.2.21",
)
```

2 changes: 1 addition & 1 deletion docs/output.md
@@ -1011,7 +1011,7 @@ Example:
info_response = InfoResponse(
name="Lightspeed Stack",
service_version="1.0.0",
-llama_stack_version="0.2.20",
+llama_stack_version="0.2.21",
)
```

2 changes: 1 addition & 1 deletion examples/pyproject.llamastack.toml
@@ -4,7 +4,7 @@ version = "0.1.0"
description = "Default template for PDM package"
authors = []
dependencies = [
-"llama-stack==0.2.20",
+"llama-stack==0.2.21",
"fastapi>=0.115.12",
"opentelemetry-sdk>=1.34.0",
"opentelemetry-exporter-otlp>=1.34.0",
2 changes: 1 addition & 1 deletion src/models/responses.py
@@ -164,7 +164,7 @@ class InfoResponse(BaseModel):
info_response = InfoResponse(
name="Lightspeed Stack",
service_version="1.0.0",
-llama_stack_version="0.2.20",
+llama_stack_version="0.2.21",
)
```
"""
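
The updated docstring example can be run end to end with a minimal sketch. A stdlib dataclass stands in for the project's pydantic `BaseModel` here so the snippet is self-contained; only the three fields shown in the docstring are modeled, and the real class may add validation.

```python
from dataclasses import dataclass


# Stand-in for InfoResponse in src/models/responses.py: field names come
# from the docstring example; the real model subclasses pydantic.BaseModel.
@dataclass
class InfoResponse:
    name: str
    service_version: str
    llama_stack_version: str


info_response = InfoResponse(
    name="Lightspeed Stack",
    service_version="1.0.0",
    llama_stack_version="0.2.21",
)
print(info_response.llama_stack_version)
```

Pinning the example to `0.2.21` keeps it in step with the dependency bump in `README.md` and `examples/pyproject.llamastack.toml`.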