
Commit d47d139

Merge pull request #607 from tisnik/lcore-650-update-llama-stack-version-in-doc
LCORE-650: update Llama Stack version in documentation and examples
2 parents fd96246 + 7ed7a6d commit d47d139

8 files changed: +10 −10 lines

README.md

Lines changed: 1 addition & 1 deletion

````diff
@@ -245,7 +245,7 @@ version = "0.1.0"
 description = "Llama Stack runner"
 authors = []
 dependencies = [
-    "llama-stack==0.2.20",
+    "llama-stack==0.2.21",
     "fastapi>=0.115.12",
     "opentelemetry-sdk>=1.34.0",
     "opentelemetry-exporter-otlp>=1.34.0",
````

docs/deployment_guide.md

Lines changed: 1 addition & 1 deletion

````diff
@@ -390,7 +390,7 @@ cp examples/run.yaml /tmp/llama-stack-server
 The output should be in this form:
 ```json
 {
-    "version": "0.2.20"
+    "version": "0.2.21"
 }
 ```
````
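The version check above is easy to script; a minimal sketch, assuming the command's JSON output has already been captured as a string (the `output` value here is a stand-in, not real command output):

```python
import json

# Stand-in for the captured JSON output of the version check command.
output = '{"version": "0.2.21"}'

data = json.loads(output)
assert data["version"] == "0.2.21", f"unexpected llama-stack version: {data['version']}"
print("llama-stack version OK")
```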

docs/getting_started.md

Lines changed: 2 additions & 2 deletions

````diff
@@ -24,7 +24,7 @@ It is possible to run Lightspeed Core Stack service with Llama Stack "embedded"
 1. Add and install all required dependencies
 ```bash
 uv add \
-    "llama-stack==0.2.20" \
+    "llama-stack==0.2.21" \
     "fastapi>=0.115.12" \
     "opentelemetry-sdk>=1.34.0" \
     "opentelemetry-exporter-otlp>=1.34.0" \
@@ -383,4 +383,4 @@ curl -X POST "http://localhost:8080/v1/query" \
 ```

 #### Step 4: Verify connectivity
-After starting the MCP servers and updating `lightspeed-stack.yaml`, test by sending a prompt to the AI agent. LCS evaluates the prompt against available tools’ metadata, selects the appropriate tool, calls the corresponding MCP server, and uses the result to generate more accurate agent response.
+After starting the MCP servers and updating `lightspeed-stack.yaml`, test by sending a prompt to the AI agent. LCS evaluates the prompt against available tools’ metadata, selects the appropriate tool, calls the corresponding MCP server, and uses the result to generate more accurate agent response.
````
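Version bumps like this one touch the same pin in several files; a small hypothetical helper (helper names and the in-memory snippets are illustrative, not part of the repository) can confirm that every `llama-stack==` pin agrees:

```python
import re

PIN_RE = re.compile(r"llama-stack==([\d.]+)")

def find_pins(text: str) -> list[str]:
    """Return every exact llama-stack version pinned in the given text."""
    return PIN_RE.findall(text)

def pins_consistent(texts: list[str], expected: str) -> bool:
    """True when every pin found across the file contents equals `expected`."""
    return all(v == expected for text in texts for v in find_pins(text))

# In-memory snippets mirroring the kinds of lines this commit touched:
snippets = [
    '"llama-stack==0.2.21",',           # pyproject dependency entry
    'uv add "llama-stack==0.2.21" \\',  # getting_started.md command
]
assert pins_consistent(snippets, "0.2.21")
```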

docs/openapi.json

Lines changed: 2 additions & 2 deletions

````diff
@@ -2153,7 +2153,7 @@
         "llama_stack_version"
       ],
       "title": "InfoResponse",
-      "description": "Model representing a response to an info request.\n\nAttributes:\n name: Service name.\n service_version: Service version.\n llama_stack_version: Llama Stack version.\n\nExample:\n ```python\n info_response = InfoResponse(\n name=\"Lightspeed Stack\",\n service_version=\"1.0.0\",\n llama_stack_version=\"0.2.20\",\n )\n ```",
+      "description": "Model representing a response to an info request.\n\nAttributes:\n name: Service name.\n service_version: Service version.\n llama_stack_version: Llama Stack version.\n\nExample:\n ```python\n info_response = InfoResponse(\n name=\"Lightspeed Stack\",\n service_version=\"1.0.0\",\n llama_stack_version=\"0.2.21\",\n )\n ```",
       "examples": [
         {
           "llama_stack_version": "1.0.0",
@@ -3159,4 +3159,4 @@
       }
     }
   }
-}
+}
````

docs/openapi.md

Lines changed: 1 addition & 1 deletion

````diff
@@ -1011,7 +1011,7 @@ Example:
 info_response = InfoResponse(
     name="Lightspeed Stack",
     service_version="1.0.0",
-    llama_stack_version="0.2.20",
+    llama_stack_version="0.2.21",
 )
 ```
````

docs/output.md

Lines changed: 1 addition & 1 deletion

````diff
@@ -1011,7 +1011,7 @@ Example:
 info_response = InfoResponse(
     name="Lightspeed Stack",
     service_version="1.0.0",
-    llama_stack_version="0.2.20",
+    llama_stack_version="0.2.21",
 )
 ```
````

examples/pyproject.llamastack.toml

Lines changed: 1 addition & 1 deletion

````diff
@@ -4,7 +4,7 @@ version = "0.1.0"
 description = "Default template for PDM package"
 authors = []
 dependencies = [
-    "llama-stack==0.2.20",
+    "llama-stack==0.2.21",
     "fastapi>=0.115.12",
     "opentelemetry-sdk>=1.34.0",
     "opentelemetry-exporter-otlp>=1.34.0",
````

src/models/responses.py

Lines changed: 1 addition & 1 deletion

````diff
@@ -164,7 +164,7 @@ class InfoResponse(BaseModel):
     info_response = InfoResponse(
         name="Lightspeed Stack",
         service_version="1.0.0",
-        llama_stack_version="0.2.20",
+        llama_stack_version="0.2.21",
     )
     ```
     """
````
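The docstring example above runs as-is against the Pydantic model; a self-contained sketch, with the field types assumed from the attribute list in the docstring rather than copied from the repository:

```python
from pydantic import BaseModel

class InfoResponse(BaseModel):
    """Model representing a response to an info request (field types assumed)."""
    name: str
    service_version: str
    llama_stack_version: str

info_response = InfoResponse(
    name="Lightspeed Stack",
    service_version="1.0.0",
    llama_stack_version="0.2.21",
)
assert info_response.llama_stack_version == "0.2.21"
```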

0 commit comments