
Commit 4f979ce

release: 0.1.0-alpha.9 (#15)
* feat(api): manual updates (add streaming to chat completions)
* release: 0.1.0-alpha.9

Co-authored-by: stainless-app[bot] <142633134+stainless-app[bot]@users.noreply.github.com>
1 parent ecca60c commit 4f979ce

File tree

14 files changed: +851 −41 lines


.release-please-manifest.json

Lines changed: 1 addition & 1 deletion

```diff
@@ -1,3 +1,3 @@
 {
-    ".": "0.1.0-alpha.8"
+    ".": "0.1.0-alpha.9"
 }
```

.stats.yml

Lines changed: 1 addition & 1 deletion

```diff
@@ -1,4 +1,4 @@
 configured_endpoints: 76
 openapi_spec_url: https://storage.googleapis.com/stainless-sdk-openapi-specs/digitalocean%2Fgradientai-e8b3cbc80e18e4f7f277010349f25e1319156704f359911dc464cc21a0d077a6.yml
 openapi_spec_hash: c773d792724f5647ae25a5ae4ccec208
-config_hash: e1b3d85ba9ae21d729a914c789422ba7
+config_hash: 0bc3af28d4abd9be8bcc81f615bc832d
```

CHANGELOG.md

Lines changed: 8 additions & 0 deletions

```diff
@@ -1,5 +1,13 @@
 # Changelog
 
+## 0.1.0-alpha.9 (2025-06-28)
+
+Full Changelog: [v0.1.0-alpha.8...v0.1.0-alpha.9](https://github.com/digitalocean/gradientai-python/compare/v0.1.0-alpha.8...v0.1.0-alpha.9)
+
+### Features
+
+* **api:** manual updates ([e0c210a](https://github.com/digitalocean/gradientai-python/commit/e0c210a0ffde24bd2c5877689f8ab222288cc597))
+
 ## 0.1.0-alpha.8 (2025-06-27)
 
 Full Changelog: [v0.1.0-alpha.7...v0.1.0-alpha.8](https://github.com/digitalocean/gradientai-python/compare/v0.1.0-alpha.7...v0.1.0-alpha.8)
```

api.md

Lines changed: 2 additions & 2 deletions

````diff
@@ -65,7 +65,7 @@ Methods:
 Types:
 
 ```python
-from gradientai.types.agents.chat import ChatCompletionChunk, CompletionCreateResponse
+from gradientai.types.agents.chat import AgentChatCompletionChunk, CompletionCreateResponse
 ```
 
 Methods:
@@ -396,7 +396,7 @@ Methods:
 Types:
 
 ```python
-from gradientai.types.chat import CompletionCreateResponse
+from gradientai.types.chat import ChatCompletionChunk, CompletionCreateResponse
 ```
 
 Methods:
````

pyproject.toml

Lines changed: 1 addition & 1 deletion

```diff
@@ -1,6 +1,6 @@
 [project]
 name = "c63a5cfe-b235-4fbe-8bbb-82a9e02a482a-python"
-version = "0.1.0-alpha.8"
+version = "0.1.0-alpha.9"
 description = "The official Python library for GradientAI"
 dynamic = ["readme"]
 license = "Apache-2.0"
```

src/gradientai/_version.py

Lines changed: 1 addition & 1 deletion

```diff
@@ -1,4 +1,4 @@
 # File generated from our OpenAPI spec by Stainless. See CONTRIBUTING.md for details.
 
 __title__ = "gradientai"
-__version__ = "0.1.0-alpha.8"  # x-release-please-version
+__version__ = "0.1.0-alpha.9"  # x-release-please-version
```

src/gradientai/resources/agents/chat/completions.py

Lines changed: 9 additions & 9 deletions

```diff
@@ -20,8 +20,8 @@
 from ...._streaming import Stream, AsyncStream
 from ...._base_client import make_request_options
 from ....types.agents.chat import completion_create_params
-from ....types.agents.chat.chat_completion_chunk import ChatCompletionChunk
 from ....types.agents.chat.completion_create_response import CompletionCreateResponse
+from ....types.agents.chat.agent_chat_completion_chunk import AgentChatCompletionChunk
 
 __all__ = ["CompletionsResource", "AsyncCompletionsResource"]
 
@@ -186,7 +186,7 @@ def create(
         extra_query: Query | None = None,
         extra_body: Body | None = None,
         timeout: float | httpx.Timeout | None | NotGiven = NOT_GIVEN,
-    ) -> Stream[ChatCompletionChunk]:
+    ) -> Stream[AgentChatCompletionChunk]:
         """
         Creates a model response for the given chat conversation.
 
@@ -299,7 +299,7 @@ def create(
         extra_query: Query | None = None,
         extra_body: Body | None = None,
         timeout: float | httpx.Timeout | None | NotGiven = NOT_GIVEN,
-    ) -> CompletionCreateResponse | Stream[ChatCompletionChunk]:
+    ) -> CompletionCreateResponse | Stream[AgentChatCompletionChunk]:
         """
         Creates a model response for the given chat conversation.
 
@@ -412,7 +412,7 @@ def create(
         extra_query: Query | None = None,
         extra_body: Body | None = None,
         timeout: float | httpx.Timeout | None | NotGiven = NOT_GIVEN,
-    ) -> CompletionCreateResponse | Stream[ChatCompletionChunk]:
+    ) -> CompletionCreateResponse | Stream[AgentChatCompletionChunk]:
         return self._post(
             "/chat/completions"
             if self._client._base_url_overridden
@@ -446,7 +446,7 @@ def create(
             ),
             cast_to=CompletionCreateResponse,
             stream=stream or False,
-            stream_cls=Stream[ChatCompletionChunk],
+            stream_cls=Stream[AgentChatCompletionChunk],
         )
 
 
@@ -610,7 +610,7 @@ async def create(
         extra_query: Query | None = None,
         extra_body: Body | None = None,
         timeout: float | httpx.Timeout | None | NotGiven = NOT_GIVEN,
-    ) -> AsyncStream[ChatCompletionChunk]:
+    ) -> AsyncStream[AgentChatCompletionChunk]:
         """
         Creates a model response for the given chat conversation.
 
@@ -723,7 +723,7 @@ async def create(
         extra_query: Query | None = None,
         extra_body: Body | None = None,
         timeout: float | httpx.Timeout | None | NotGiven = NOT_GIVEN,
-    ) -> CompletionCreateResponse | AsyncStream[ChatCompletionChunk]:
+    ) -> CompletionCreateResponse | AsyncStream[AgentChatCompletionChunk]:
         """
         Creates a model response for the given chat conversation.
 
@@ -836,7 +836,7 @@ async def create(
         extra_query: Query | None = None,
         extra_body: Body | None = None,
         timeout: float | httpx.Timeout | None | NotGiven = NOT_GIVEN,
-    ) -> CompletionCreateResponse | AsyncStream[ChatCompletionChunk]:
+    ) -> CompletionCreateResponse | AsyncStream[AgentChatCompletionChunk]:
         return await self._post(
             "/chat/completions"
             if self._client._base_url_overridden
@@ -870,7 +870,7 @@ async def create(
             ),
             cast_to=CompletionCreateResponse,
             stream=stream or False,
-            stream_cls=AsyncStream[ChatCompletionChunk],
+            stream_cls=AsyncStream[AgentChatCompletionChunk],
         )
```
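The `Stream[AgentChatCompletionChunk]` return type above means that when `stream=True`, callers receive an iterator of typed chunks rather than one complete response. A minimal sketch of the usual accumulation pattern a caller would apply, using a hypothetical stand-in class instead of the SDK's real Pydantic chunk model (which carries the delta inside `choices[0].delta`):

```python
from dataclasses import dataclass
from typing import Iterator, List


@dataclass
class FakeChunk:
    """Hypothetical stand-in for AgentChatCompletionChunk, holding only the text delta."""
    delta_content: str


def accumulate(stream: Iterator[FakeChunk]) -> str:
    """Concatenate the incremental deltas from a chunk stream into the full reply."""
    parts: List[str] = []
    for chunk in stream:
        parts.append(chunk.delta_content)
    return "".join(parts)


# Simulated stream, standing in for the iterator returned when stream=True.
fake_stream = iter([FakeChunk("Hel"), FakeChunk("lo, "), FakeChunk("world!")])
print(accumulate(fake_stream))  # Hello, world!
```

The same pattern applies to the async variant: `AsyncStream[AgentChatCompletionChunk]` is consumed with `async for` instead of `for`.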