
Commit f0dc940

release: 0.4.0-alpha.2
1 parent efdf1be commit f0dc940

File tree

3 files changed: +16 −2 lines changed


.release-please-manifest.json

Lines changed: 1 addition & 1 deletion
@@ -1,3 +1,3 @@
 {
-  ".": "0.4.0-alpha.1"
+  ".": "0.4.0-alpha.2"
 }

CHANGELOG.md

Lines changed: 14 additions & 0 deletions
@@ -1,5 +1,19 @@
 # Changelog
 
+## 0.4.0-alpha.2 (2025-11-03)
+
+Full Changelog: [v0.4.0-alpha.1...v0.4.0-alpha.2](https://github.com/llamastack/llama-stack-client-python/compare/v0.4.0-alpha.1...v0.4.0-alpha.2)
+
+### Features
+
+* **api:** point models.list() to /v1/openai/v1/models ([efdf1be](https://github.com/llamastack/llama-stack-client-python/commit/efdf1be41243be5107f4863de99c5dce8504bba9))
+
+
+### Chores
+
+* bump version to 0.3.2.dev0 ([#292](https://github.com/llamastack/llama-stack-client-python/issues/292)) ([fb91556](https://github.com/llamastack/llama-stack-client-python/commit/fb915569d1b07bbbc1202e3142447807f6d42436))
+* **internal/tests:** avoid race condition with implicit client cleanup ([4af8f35](https://github.com/llamastack/llama-stack-client-python/commit/4af8f35cffaf2b3d00a38a8fc5f8ca5a0b266786))
+
 ## 0.4.0-alpha.1 (2025-10-30)
 
 Full Changelog: [v0.3.1-alpha.2...v0.4.0-alpha.1](https://github.com/llamastack/llama-stack-client-python/compare/v0.3.1-alpha.2...v0.4.0-alpha.1)
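The feature entry above only changes which endpoint `models.list()` calls (now `/v1/openai/v1/models`); the client-side call itself is unchanged. As a minimal sketch of that call, assuming a Llama Stack server running locally at `http://localhost:8321` (an illustrative address, not part of this commit):

```python
# Minimal sketch: listing models with the client from this release.
# The base_url below is an assumption for illustration; point it at your server.
from llama_stack_client import LlamaStackClient

client = LlamaStackClient(base_url="http://localhost:8321")

# As of 0.4.0-alpha.2, this request is served from /v1/openai/v1/models.
for model in client.models.list():
    print(model)
```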

pyproject.toml

Lines changed: 1 addition & 1 deletion
@@ -1,6 +1,6 @@
 [project]
 name = "llama_stack_client"
-version = "0.4.0-alpha.1"
+version = "0.4.0-alpha.2"
 description = "The official Python library for the llama-stack-client API"
 dynamic = ["readme"]
 license = "MIT"
