fix(vue/solid/svelte) keepLastMessageOnError #2269

Merged: 4 commits, Jul 15, 2024
5 changes: 0 additions & 5 deletions .changeset/ninety-beers-do.md

This file was deleted.

5 changes: 0 additions & 5 deletions .changeset/odd-avocados-search.md

This file was deleted.

5 changes: 0 additions & 5 deletions .changeset/pretty-elephants-deliver.md

This file was deleted.

Original file line number Diff line number Diff line change
@@ -93,6 +93,11 @@ The following optional settings are available for Google Generative AI models:
Top-k sampling considers the set of topK most probable tokens.
Models running with nucleus sampling don't allow topK setting.

- **cachedContent** _string_

Optional. The name of the cached content used as context to serve the prediction.
Format: cachedContents/{cachedContent}

- **safetySettings** _Array\<\{ category: string; threshold: string \}\>_

Optional. Safety settings for the model.
34 changes: 29 additions & 5 deletions examples/nuxt-openai/pages/index.vue
@@ -1,7 +1,18 @@
<script setup lang="ts">
import { useChat } from '@ai-sdk/vue';
import { computed } from 'vue';

const {
messages,
input,
handleSubmit,
isLoading,
error,
stop,
} = useChat({
keepLastMessageOnError: true,
});
const disabled = computed(() => isLoading.value || error.value != null);
</script>

<template>
@@ -11,11 +22,24 @@ const { messages, input, handleSubmit } = useChat();
{{ m.content }}
</div>

<div v-if="isLoading" class="mt-4 text-gray-500">
<div>Loading...</div>
<button type="button" class="px-4 py-2 mt-4 text-blue-500 border border-blue-500 rounded-md" @click="stop">
Stop
</button>
</div>

<div v-if="error" class="mt-4">
<div class="text-red-500">An error occurred.</div>
<button type="button" class="px-4 py-2 mt-4 text-blue-500 border border-blue-500 rounded-md"
@click="handleSubmit">
Retry
</button>
</div>

<form @submit="handleSubmit">
<input
class="fixed bottom-0 w-full max-w-md p-2 mb-8 border border-gray-300 rounded shadow-xl"
v-model="input"
placeholder="Say something..."
:disabled="disabled"
/>
</form>
</div>
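The Nuxt example above disables the input whenever a response is still streaming or the last request errored. That guard is a one-line predicate worth factoring out; a minimal sketch (`isChatDisabled` is a hypothetical helper name — the example inlines the check in `computed()`):

```typescript
// Returns true when the chat input should be disabled:
// either a response is still streaming, or the last request errored.
function isChatDisabled(isLoading: boolean, error: Error | undefined): boolean {
  return isLoading || error != null;
}

// While streaming, and after an error, the input stays disabled;
// it re-enables only once the chat is idle and error-free.
const duringStream = isChatDisabled(true, undefined);
const afterError = isChatDisabled(false, new Error('request failed'));
const idle = isChatDisabled(false, undefined);
```

The Solid and Svelte examples in this PR apply the same `isLoading || error != null` expression directly on the `disabled` attribute.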
39 changes: 38 additions & 1 deletion examples/solidstart-openai/src/routes/index.tsx
@@ -2,7 +2,17 @@ import { For } from 'solid-js';
import { useChat } from '@ai-sdk/solid';

export default function Chat() {
const {
messages,
input,
handleInputChange,
handleSubmit,
isLoading,
error,
stop,
} = useChat({
keepLastMessageOnError: true,
});

return (
<div class="flex flex-col w-full max-w-md py-24 mx-auto stretch">
@@ -15,12 +25,39 @@
)}
</For>

{isLoading() && (
<div class="mt-4 text-gray-500">
<div>Loading...</div>
<button
type="button"
class="px-4 py-2 mt-4 text-blue-500 border border-blue-500 rounded-md"
onClick={stop}
>
Stop
</button>
</div>
)}

{error() && (
<div class="mt-4">
<div class="text-red-500">An error occurred.</div>
<button
type="button"
class="px-4 py-2 mt-4 text-blue-500 border border-blue-500 rounded-md"
onClick={handleSubmit}
>
Retry
</button>
</div>
)}

<form onSubmit={handleSubmit}>
<input
class="fixed bottom-0 w-full max-w-md p-2 mb-8 border border-gray-300 rounded shadow-xl"
value={input()}
placeholder="Say something..."
onInput={handleInputChange}
disabled={isLoading() || error() != null}
/>
</form>
</div>
79 changes: 54 additions & 25 deletions examples/sveltekit-openai/src/routes/+page.svelte
@@ -1,37 +1,66 @@
<script lang="ts">
import { useChat } from '@ai-sdk/svelte';

const { input, handleSubmit, messages, isLoading, error, stop } = useChat({
keepLastMessageOnError: true,
});
</script>

<svelte:head>
<title>Home</title>
<meta name="description" content="Svelte demo app" />
</svelte:head>

<section>
<h1>useChat</h1>
<ul>
{#each $messages as message}
<li>{message.role}: {message.content}</li>
{/each}
</ul>

{#if $isLoading}
<div class="mt-4 text-gray-500">
<div>Loading...</div>
<button
type="button"
class="px-4 py-2 mt-4 text-blue-500 border border-blue-500 rounded-md"
on:click={stop}
>
Stop
</button>
</div>
{/if}

{#if $error}
<div class="mt-4">
<div class="text-red-500">An error occurred.</div>
<button
type="button"
class="px-4 py-2 mt-4 text-blue-500 border border-blue-500 rounded-md"
on:click={handleSubmit}
>
Retry
</button>
</div>
{/if}

<form on:submit={handleSubmit}>
<input bind:value={$input} disabled={$isLoading || $error != null} />
<button type="submit">Send</button>
</form>
</section>

<style>
section {
display: flex;
flex-direction: column;
justify-content: center;
align-items: center;
flex: 0.6;
}

h1 {
width: 100%;
}
</style>
6 changes: 6 additions & 0 deletions packages/google-vertex/CHANGELOG.md
@@ -1,5 +1,11 @@
# @ai-sdk/google-vertex

## 0.0.15

### Patch Changes

- bb584330: feat (provider/google-vertex): use systemInstruction content parts

## 0.0.14

### Patch Changes
2 changes: 1 addition & 1 deletion packages/google-vertex/package.json
@@ -1,6 +1,6 @@
{
"name": "@ai-sdk/google-vertex",
"version": "0.0.14",
"version": "0.0.15",
"license": "Apache-2.0",
"sideEffects": false,
"main": "./dist/index.js",
8 changes: 8 additions & 0 deletions packages/google/CHANGELOG.md
@@ -1,5 +1,13 @@
# @ai-sdk/google

## 0.0.27

### Patch Changes

- 2e59d266: feat (provider/google): add cachedContent optional setting
- d2b9723d: feat (provider/google): support system instructions
- 4dfe0b00: feat (provider/google): add tool support for object generation (new default mode)

## 0.0.26

### Patch Changes
2 changes: 1 addition & 1 deletion packages/google/package.json
@@ -1,6 +1,6 @@
{
"name": "@ai-sdk/google",
"version": "0.0.26",
"version": "0.0.27",
"license": "Apache-2.0",
"sideEffects": false,
"main": "./dist/index.js",
3 changes: 3 additions & 0 deletions packages/google/src/google-generative-ai-language-model.ts
@@ -110,6 +110,7 @@ export class GoogleGenerativeAILanguageModel implements LanguageModelV1 {
systemInstruction,
safetySettings: this.settings.safetySettings,
...prepareToolsAndToolConfig(mode),
cachedContent: this.settings.cachedContent,
},
warnings,
};
@@ -125,6 +126,7 @@ export class GoogleGenerativeAILanguageModel implements LanguageModelV1 {
contents,
systemInstruction,
safetySettings: this.settings.safetySettings,
cachedContent: this.settings.cachedContent,
},
warnings,
};
@@ -146,6 +148,7 @@
},
toolConfig: { functionCallingConfig: { mode: 'ANY' } },
safetySettings: this.settings.safetySettings,
cachedContent: this.settings.cachedContent,
},
warnings,
};
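All three hunks above thread `this.settings.cachedContent` into the per-mode request args. A minimal sketch of that merge (`buildArgs` and the argument shapes are hypothetical — the real provider builds these objects inline for each mode):

```typescript
// Settings subset relevant to this PR.
interface Settings {
  cachedContent?: string;
  safetySettings?: Array<{ category: string; threshold: string }>;
}

// Merge optional settings into a request-args object. When cachedContent
// is unset, the property is undefined and JSON.stringify drops it, so the
// wire format is unchanged for callers that don't use context caching.
function buildArgs(settings: Settings, base: Record<string, unknown>) {
  return {
    ...base,
    safetySettings: settings.safetySettings,
    cachedContent: settings.cachedContent,
  };
}

const withCache = buildArgs({ cachedContent: 'cachedContents/example' }, { contents: [] });
const withoutCache = buildArgs({}, { contents: [] });
```

Serializing `withoutCache` yields no `cachedContent` key at all, which is why the setting can be added to every mode without affecting existing requests.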
7 changes: 7 additions & 0 deletions packages/google/src/google-generative-ai-settings.ts
@@ -16,6 +16,13 @@ Models running with nucleus sampling don't allow topK setting.
*/
topK?: number;

/**
Optional.
The name of the cached content used as context to serve the prediction.
Format: cachedContents/{cachedContent}
*/
cachedContent?: string;

/**
Optional. A list of unique safety settings for blocking unsafe content.
*/
15 changes: 13 additions & 2 deletions packages/solid/src/use-chat.ts
@@ -101,6 +101,7 @@ const getStreamedResponse = async (
onToolCall: UseChatOptions['onToolCall'] | undefined,
sendExtraMessageFields: boolean | undefined,
fetch: FetchFunction | undefined,
keepLastMessageOnError: boolean | undefined,
) => {
// Do an optimistic update to the chat state to show the updated messages
// immediately.
@@ -139,7 +140,9 @@ },
},
abortController: () => abortController,
restoreMessagesOnFailure() {
mutate(previousMessages);
if (!keepLastMessageOnError) {
mutate(previousMessages);
}
},
onResponse,
onUpdate(merged, data) {
@@ -170,6 +173,13 @@ case of misconfigured tools.
By default, it's set to 0, which will disable the feature.
*/
maxToolRoundtrips?: number;
/**
Keeps the last message when an error happens. This will be the default behavior
starting with the next major release.
The flag was introduced for backwards compatibility.
Please enable it and update your error handling/resubmit behavior.
*/
keepLastMessageOnError?: boolean;
};
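The new flag gates the rollback in `restoreMessagesOnFailure`: when a request fails, the optimistic update is reverted only if `keepLastMessageOnError` is falsy. The gate can be sketched as a pure function (`restoreOnFailure` is a hypothetical name — the hook passes a closure over its own `mutate()`):

```typescript
type Message = { role: string; content: string };

// Decide which message list survives a failed request. With the flag on,
// the optimistic (current) list is kept so the failed user message stays
// visible; otherwise the pre-request snapshot is restored.
function restoreOnFailure(
  current: Message[],
  previous: Message[],
  keepLastMessageOnError: boolean | undefined,
): Message[] {
  return keepLastMessageOnError ? current : previous;
}

const prev = [{ role: 'user', content: 'hi' }];
const optimistic = [...prev, { role: 'user', content: 'retry me' }];
```

Keeping the last message is what lets the Vue, Solid, and Svelte examples in this PR render a Retry button that simply calls `handleSubmit` again.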

export function useChat(
@@ -258,6 +268,7 @@ export function useChat(
useChatOptions().onToolCall?.(),
useChatOptions().sendExtraMessageFields?.(),
useChatOptions().fetch?.(),
useChatOptions().keepLastMessageOnError?.(),
),
experimental_onFunctionCall:
useChatOptions().experimental_onFunctionCall?.(),
@@ -269,7 +280,7 @@

abortController = null;
} catch (err) {
// Ignore abort errors as they are expected.
if ((err as any).name === 'AbortError') {
abortController = null;
return null;