fix (ai/ui): keep last message in useChat on error (#2262)
Co-authored-by: AntzyMo <mozbano@163.com>
Co-authored-by: github-actions[bot] <41898282+github-actions[bot]@users.noreply.github.com>
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
4 people authored Jul 15, 2024
1 parent 92217a8 commit a6cb2c8
Showing 18 changed files with 443 additions and 99 deletions.
10 changes: 10 additions & 0 deletions .changeset/sour-points-fold.md
@@ -0,0 +1,10 @@
---
'@ai-sdk/ui-utils': patch
'@ai-sdk/svelte': patch
'@ai-sdk/react': patch
'@ai-sdk/solid': patch
'ai': patch
'@ai-sdk/vue': patch
---

feat (ai/ui): add keepLastMessageOnError option to useChat
144 changes: 112 additions & 32 deletions content/docs/05-ai-sdk-ui/02-chatbot.mdx
@@ -24,7 +24,7 @@ import { useChat } from 'ai/react';

export default function Page() {
const { messages, input, handleInputChange, handleSubmit } = useChat({
-    api: 'api/chat',
+    keepLastMessageOnError: true,
});

return (
@@ -35,13 +35,9 @@ export default function Page() {
{message.content}
</div>
))}

<form onSubmit={handleSubmit}>
-        <input
-          name="prompt"
-          value={input}
-          onChange={handleInputChange}
-          id="input"
-        />
+        <input name="prompt" value={input} onChange={handleInputChange} />
<button type="submit">Submit</button>
</form>
</>
@@ -50,60 +46,143 @@ export default function Page() {
```

```ts filename='app/api/chat/route.ts'
-import { type CoreMessage, streamText } from 'ai';
import { openai } from '@ai-sdk/openai';
+import { convertToCoreMessages, streamText } from 'ai';

// Allow streaming responses up to 30 seconds
export const maxDuration = 30;

export async function POST(req: Request) {
-  const { messages }: { messages: CoreMessage[] } = await req.json();
+  const { messages } = await req.json();

  const result = await streamText({
-    model: openai('gpt-4'),
+    model: openai('gpt-4-turbo'),
    system: 'You are a helpful assistant.',
-    messages,
+    messages: convertToCoreMessages(messages),
});

return result.toAIStreamResponse();
}
```
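The `convertToCoreMessages` call in the route handler above strips the UI-only fields from the messages before they are passed to the model. As a rough mental model (this is an illustrative sketch, not the SDK's actual implementation), it behaves like:

```typescript
// Illustrative sketch only (assumption): UI messages carry extra fields such
// as `id` and `createdAt`; the model call only needs `role` and `content`.
type UIMessage = {
  id: string;
  role: 'user' | 'assistant' | 'system';
  content: string;
  createdAt?: Date;
};

type CoreMessage = {
  role: 'user' | 'assistant' | 'system';
  content: string;
};

function toCoreMessages(messages: UIMessage[]): CoreMessage[] {
  // Keep only the fields the provider call consumes.
  return messages.map(({ role, content }) => ({ role, content }));
}
```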

-In the `Page` component, the `useChat` hook will request to your AI provider endpoint whenever the user submits a message. The messages are then streamed back in real-time and displayed in the chat UI.
+In the `Page` component, the `useChat` hook will request to your AI provider endpoint whenever the user submits a message.
+The messages are then streamed back in real-time and displayed in the chat UI.

+This enables a seamless chat experience where the user can see the AI response as soon as it is available,
+without having to wait for the entire response to be received.

-This enables a seamless chat experience where the user can see the AI response as soon as it is available, without having to wait for the entire response to be received.
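The progressive rendering described here can be sketched with plain TypeScript (the chunk values are hypothetical; the real stream arrives asynchronously over the network):

```typescript
// Sketch of progressive rendering: the UI appends each streamed chunk to the
// in-progress message, so the user watches the text grow instead of waiting
// for the complete response. The chunks below are hypothetical examples.
function renderedSnapshots(chunks: string[]): string[] {
  let content = '';
  // Each snapshot is what the chat UI would display after that chunk arrives.
  return chunks.map(chunk => (content += chunk));
}
```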
<Note>
`useChat` has a `keepLastMessageOnError` option that defaults to `false`. This
option can be enabled to keep the last message on error. We will make this the
default behavior in the next major release. Please enable it and update your
error handling/resubmit behavior.
</Note>
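The effect of the `keepLastMessageOnError` flag on the message list can be modeled like this (a sketch of the observable behavior, not the hook's internal code):

```typescript
type Message = { id: string; role: 'user' | 'assistant'; content: string };

// Sketch (assumption, not the hook's source): without the flag, the
// optimistically added user message is rolled back when the request fails;
// with it, the message stays so the user can retry without retyping.
function messagesAfterError(
  messages: Message[],
  keepLastMessageOnError: boolean,
): Message[] {
  if (keepLastMessageOnError) return messages;
  return messages.slice(0, -1); // drop the unanswered user message
}
```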

## Customized UI

`useChat` also provides ways to manage the chat message and input states via code, show loading and error states, and update messages without being triggered by user interactions.

-### Loading and error states
+### Loading State

-To show a loading spinner while the chatbot is processing the user's message, you can use the `isLoading` state returned by the `useChat` hook:
+The `isLoading` state returned by the `useChat` hook can be used for several
+purposes:

-```tsx
-const { isLoading, ... } = useChat()
+- To show a loading spinner while the chatbot is processing the user's message.
+- To show a "Stop" button to abort the current message.
+- To disable the submit button.

-return <>
-  {isLoading ? <Spinner /> : null}
-  ...
```tsx filename='app/page.tsx' highlight="6,20-27,34"
'use client';

import { useChat } from 'ai/react';

export default function Page() {
const { messages, input, handleInputChange, handleSubmit, isLoading } =
useChat({
keepLastMessageOnError: true,
});

return (
<>
{messages.map(message => (
<div key={message.id}>
{message.role === 'user' ? 'User: ' : 'AI: '}
{message.content}
</div>
))}

{isLoading && (
<div>
<Spinner />
<button type="button" onClick={() => stop()}>
Stop
</button>
</div>
)}

<form onSubmit={handleSubmit}>
<input
name="prompt"
value={input}
onChange={handleInputChange}
disabled={isLoading}
/>
<button type="submit">Submit</button>
</form>
</>
);
}
```

-Similarly, the `error` state reflects the error object thrown during the fetch request. It can be used to display an error message, or show a toast notification:
+### Error State

-```tsx
-const { error, ... } = useChat()
+Similarly, the `error` state reflects the error object thrown during the fetch request.
+It can be used to display an error message, disable the submit button, or show a retry button.

-useEffect(() => {
-  if (error) {
-    toast.error(error.message)
-  }
-}, [error])
+<Note>
+  We recommend showing a generic error message to the user, such as "Something
+  went wrong." This is a good practice to avoid leaking information from the
+  server.
+</Note>

-// Or display the error message in the UI:
-return <>
-  {error ? <div>{error.message}</div> : null}
-  ...
```tsx file="app/page.tsx" highlight="6,18-25,31"
'use client';

import { useChat } from 'ai/react';

export default function Chat() {
const { messages, input, handleInputChange, handleSubmit, error, reload } =
useChat({
keepLastMessageOnError: true,
});

return (
<div>
{messages.map(m => (
<div key={m.id}>
{m.role}: {m.content}
</div>
))}

{error && (
<>
<div>An error occurred.</div>
<button type="button" onClick={() => reload()}>
Retry
</button>
</>
)}

<form onSubmit={handleSubmit}>
<input
value={input}
onChange={handleInputChange}
disabled={error != null}
/>
</form>
</div>
);
}
```

### Modify messages
@@ -175,6 +254,7 @@ const { reload, isLoading, ... } = useChat()
return <>
<button onClick={reload} disabled={isLoading}>Regenerate</button>
...
+</>
```

When the user clicks the "Regenerate" button, the AI provider will regenerate the last message and replace the current one correspondingly.
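The replace-the-last-message behavior can be sketched in plain TypeScript (an illustrative model of what a regenerate does conceptually, not the SDK's source):

```typescript
type ChatMessage = { role: 'user' | 'assistant'; content: string };

// Sketch (assumption): regenerating drops the trailing assistant message, if
// any, and requests a fresh completion for the remaining conversation.
function regenerate(
  messages: ChatMessage[],
  complete: (history: ChatMessage[]) => string,
): ChatMessage[] {
  const last = messages[messages.length - 1];
  const history =
    last && last.role === 'assistant' ? messages.slice(0, -1) : messages;
  return [...history, { role: 'assistant', content: complete(history) }];
}
```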
90 changes: 77 additions & 13 deletions content/docs/05-ai-sdk-ui/21-error-handling.mdx
@@ -5,27 +5,91 @@ description: Learn how to handle errors in the AI SDK UI

# Error Handling

-Errors can be handled by passing an [`onError`](/docs/reference/ai-sdk-ui/use-chat#on-error) callback function as an option to the [`useChat`](/docs/reference/ai-sdk-ui/use-chat), [`useCompletion`](/docs/reference/ai-sdk-ui/use-completion) or [`useAssistant`](/docs/reference/ai-sdk-ui/use-assistant) hooks.
+### Error Handling Callback

-Each AI SDK UI hook also returns an [error](/docs/reference/ai-sdk-ui/use-chat#error) object that you can use to render the error in your UI.
+Errors can be processed by passing an [`onError`](/docs/reference/ai-sdk-ui/use-chat#on-error) callback function as an option to the [`useChat`](/docs/reference/ai-sdk-ui/use-chat), [`useCompletion`](/docs/reference/ai-sdk-ui/use-completion) or [`useAssistant`](/docs/reference/ai-sdk-ui/use-assistant) hooks.
+The callback function receives an error object as an argument.

-```tsx
+```tsx file="app/page.tsx" highlight="7-10"
import { useChat } from 'ai/react';

-const { ... } = useChat({
-  onError: error => {
-    // handle error
-    console.error(error);
-  },
-});
+export default function Page() {
+  const {
+    /* ... */
+  } = useChat({
+    onError: error => {
+      // handle error
+      console.error(error);
+    },
+  });
+}
```

### Error Helper Object

Each AI SDK UI hook also returns an [error](/docs/reference/ai-sdk-ui/use-chat#error) object that you can use to render the error in your UI.
You can use the error object to show an error message, disable the submit button, or show a retry button.

<Note>
We recommend showing a generic error message to the user, such as "Something
went wrong." This is a good practice to avoid leaking information from the
server.
</Note>

```tsx file="app/page.tsx" highlight="7,19-26,32"
'use client';

-```tsx
import { useChat } from 'ai/react';

-const { error } = useChat();
-if (error) return <div>{error.message}</div>;
-});
+export default function Chat() {
const { messages, input, handleInputChange, handleSubmit, error, reload } =
useChat({
keepLastMessageOnError: true,
});

return (
<div>
{messages.map(m => (
<div key={m.id}>
{m.role}: {m.content}
</div>
))}

{error && (
<>
<div>An error occurred.</div>
<button type="button" onClick={() => reload()}>
Retry
</button>
</>
)}

<form onSubmit={handleSubmit}>
<input
value={input}
onChange={handleInputChange}
disabled={error != null}
/>
</form>
</div>
);
}
```

### useChat: Keep Last Message on Error

`useChat` has a `keepLastMessageOnError` option that defaults to `false`.
This option can be enabled to keep the last message on error.
We will make this the default behavior in the next major release.
Please enable it and update your error handling/resubmit behavior.
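Because the failed user message stays in `messages` when the flag is enabled, resubmitting is just issuing the same request again. A minimal sketch of such a retry flow (`submitWithRetry` is a hypothetical helper, not part of the SDK):

```typescript
// Hypothetical retry helper (not part of the SDK): with
// keepLastMessageOnError enabled, the messages array is unchanged after a
// failure, so a retry simply sends the same input again.
async function submitWithRetry(
  send: (messages: string[]) => Promise<string>,
  messages: string[],
  retries: number,
): Promise<string> {
  let lastError: unknown;
  for (let attempt = 0; attempt <= retries; attempt++) {
    try {
      return await send(messages); // messages identical between attempts
    } catch (err) {
      lastError = err;
    }
  }
  throw lastError;
}
```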

### Injecting Errors for Testing

You might want to create errors for testing.
You can easily do so by throwing an error in your route handler:

```ts file="app/api/chat/route.ts"
export async function POST(req: Request) {
throw new Error('This is a test error');
}
```
6 changes: 6 additions & 0 deletions content/docs/07-reference/ai-sdk-ui/01-use-chat.mdx
@@ -43,6 +43,12 @@ Allows you to easily create a conversational user interface for your chatbot app
type: "string = '/api/chat'",
description: 'The chat completion API endpoint offered by the provider.',
},
{
name: 'keepLastMessageOnError',
type: 'boolean',
description:
'Keeps the last message when an error happens. This will be the default behavior starting with the next major release. The flag was introduced for backwards compatibility and currently defaults to `false`. Please enable it and update your error handling/resubmit behavior.',
},
{
name: 'id',
type: 'string',
4 changes: 2 additions & 2 deletions examples/next-openai/app/api/chat/route.ts
@@ -1,5 +1,5 @@
import { openai } from '@ai-sdk/openai';
-import { streamText } from 'ai';
+import { convertToCoreMessages, streamText } from 'ai';

// Allow streaming responses up to 30 seconds
export const maxDuration = 30;
@@ -11,7 +11,7 @@ export async function POST(req: Request) {
// Call the language model
const result = await streamText({
model: openai('gpt-4-turbo'),
-    messages,
+    messages: convertToCoreMessages(messages),
async onFinish({ text, toolCalls, toolResults, usage, finishReason }) {
// implement your own logic here, e.g. for storing messages
// or recording token usage
