
Fix multiple parallel tool call support in streamUI #1894

Open
mfclarke-cnx opened this issue Jun 9, 2024 · 12 comments
Labels
ai/rsc, bug (Something isn't working)

Comments


mfclarke-cnx commented Jun 9, 2024

Bug Description

streamUI doesn't gracefully handle multiple parallel tool calls. Currently, when the model makes multiple tool calls, the tool handler is invoked once per call, but the first call to resolve closes the UI stream and the rest throw:

⨯ node_modules/.pnpm/ai@3.1.26_openai@4.48.1_react@18.3.1_solid-js@1.8.17_svelte@4.2.17_vue@3.4.27_zod@3.23.8/node_modules/ai/rsc/dist/rsc-server.mjs (1231:12) @ assertStream
 ⨯ unhandledRejection: Error: .update(): UI stream is already closed.
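
For context, here is a minimal repro sketch of the kind of setup that triggers this, assuming a typical streamUI call with two zod-described tools (the model, tool names, and handlers below are illustrative, not taken from my actual app):

import { openai } from '@ai-sdk/openai';
import { streamUI } from 'ai/rsc';
import { z } from 'zod';

export async function ask(prompt: string) {
  const result = await streamUI({
    model: openai('gpt-4o'),
    prompt,
    text: ({ content }) => content,
    tools: {
      // With two independent tools, the model may decide to call both in
      // parallel, which is when the "UI stream is already closed" error appears.
      getWeather: {
        description: 'Get the weather for a city',
        parameters: z.object({ city: z.string() }),
        generate: async ({ city }) => `Weather in ${city}: sunny`,
      },
      getNews: {
        description: 'Get the latest news for a city',
        parameters: z.object({ city: z.string() }),
        generate: async ({ city }) => `News for ${city}: nothing new`,
      },
    },
  });
  return result.value;
}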
mfclarke-cnx changed the title from "streamUI doesn't support multiple parallel tool calls" to "Support multiple parallel tool calls in streamUI" Jun 9, 2024
lgrammel added the enhancement (New feature or request) and ai/rsc labels Jun 11, 2024

camwest commented Jun 11, 2024

It looks like this hardcoded "true" is where the error is happening.

It's a bad assumption: a tool-call event doesn't necessarily mean this is the last render call, because there may be multiple tool-call events.
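
To make the failure mode concrete, here is a self-contained sketch (not the SDK's actual source) of what happens when every tool-call part is treated as the final render:

type StreamPart = { type: 'tool-call'; toolName: string } | { type: 'finish' };

// Minimal stand-in for the streamable UI: once closed, further updates throw.
function createMockStreamableUI() {
  let closed = false;
  return {
    update(_node: unknown) {
      if (closed) throw new Error('.update(): UI stream is already closed.');
    },
    done(_node?: unknown) {
      if (closed) throw new Error('.done(): UI stream is already closed.');
      closed = true;
    },
  };
}

const ui = createMockStreamableUI();
const parts: StreamPart[] = [
  { type: 'tool-call', toolName: 'getWeather' },
  { type: 'tool-call', toolName: 'getNews' }, // the parallel second call
];

for (const part of parts) {
  if (part.type === 'tool-call') {
    // Each handler streams an interim state, then closes the UI under the
    // hardcoded "this is the last call" assumption. The first tool call closes
    // the stream, so the second call's update throws the error reported above.
    ui.update(`rendering ${part.toolName}...`);
    ui.done(`tool result for ${part.toolName}`);
  }
}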


camwest commented Jun 11, 2024

A potential fix would be to add an 'all-tool-calls-done' event that gets pushed to the stream after the loop, right around here:

Then we handle that in stream-ui and finalize the stream there.

The challenge is that we'd have to update everything that implements LanguageModelV1 to emit this event, or bump the version to V2.
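
For illustration, a hedged sketch of what that could look like; the part names and types below are assumptions for this proposal, not existing SDK API:

// A new stream part emitted exactly once, after the provider has pushed every
// tool call for the current response.
type ToolCallPart = { type: 'tool-call'; toolCallId: string; toolName: string; args: unknown };
type AllToolCallsDonePart = { type: 'all-tool-calls-done' };
type StreamPart = ToolCallPart | AllToolCallsDonePart;

function handlePart(part: StreamPart, ui: { update(node: unknown): void; done(): void }) {
  switch (part.type) {
    case 'tool-call':
      // Keep the stream open: more tool calls may still arrive.
      ui.update(`rendering ${part.toolName}`);
      break;
    case 'all-tool-calls-done':
      // Only finalize once the provider says every tool call has been emitted.
      ui.done();
      break;
  }
}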


camwest commented Jun 11, 2024

Another option is to close the stream on finish:

flush(controller) {
  controller.enqueue({
    type: 'finish',
    finishReason,
    logprobs,
    usage,
  });
}

Currently not implemented in streamUI:

// Nothing to do here.

@bneigher

Ah, this is the culprit of my issue! It's been doing this since v3.1.25.


camwest commented Jun 11, 2024

@mfclarke-cnx can you change this to a bug instead of an enhancement?

@mfclarke-cnx
Author

@camwest unfortunately I don't have permission to change the labels. @lgrammel, would you be able to update the labels, please?

mfclarke-cnx changed the title from "Support multiple parallel tool calls in streamUI" to "Fix multiple parallel tool call support in streamUI" Jun 11, 2024
MaxLeiter added the bug (Something isn't working) label and removed the enhancement (New feature or request) label Jun 12, 2024
@miguelvictor
Contributor

I also have this issue, but the weird thing in my case is that the same tool is being called twice.

@gclark-eightfold
Contributor

As tagged by @bneigher, I have an approach I'm working on in #1904, but it's still a WIP.

The main issue I have right now is that, because we don't know how many tool calls will be made ahead of time, the parent streamableUI doesn't call .done until all tool calls complete.

I think my expectation is that the UI will be appended to as the final tool calls stream in.
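
A hedged sketch of that expectation, assuming createStreamableUI from 'ai/rsc'; the wrapper below is illustrative and is not #1904's actual implementation:

import type { ReactNode } from 'react';
import { createStreamableUI } from 'ai/rsc';

function createAppendingUI() {
  const parent = createStreamableUI();
  const nodes: ReactNode[] = [];
  return {
    // Called once per completed tool call: re-render with everything so far,
    // but keep the parent stream open.
    append(node: ReactNode) {
      nodes.push(node);
      parent.update(nodes);
    },
    // Called only after the last tool call has resolved.
    finish() {
      parent.done(nodes);
    },
    value: parent.value,
  };
}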


camwest commented Jun 13, 2024

The main issue I have right now is that, because we don't know how many tool calls will be made ahead of time, the parent streamableUI doesn't call .done until all tool calls complete.

I think you can get this information at the provider level.

For example, the OpenAI provider passes an array of tool_calls, so once you've finished iterating over it you know it's 'done'.
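
A hedged sketch of that, using the OpenAI SDK directly (the model name and tool definition are illustrative):

import OpenAI from 'openai';

const client = new OpenAI();

const completion = await client.chat.completions.create({
  model: 'gpt-4o',
  messages: [{ role: 'user', content: 'What is the weather in SF and in NYC?' }],
  tools: [
    {
      type: 'function',
      function: {
        name: 'getWeather',
        description: 'Get the weather for a city',
        parameters: { type: 'object', properties: { city: { type: 'string' } }, required: ['city'] },
      },
    },
  ],
});

// tool_calls is a plain array, so the provider knows how many calls there are
// and, while iterating, which one is the last.
const toolCalls = completion.choices[0].message.tool_calls ?? [];
for (const [i, call] of toolCalls.entries()) {
  const isLastCall = i === toolCalls.length - 1;
  console.log(call.function.name, call.function.arguments, { isLastCall });
}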

@HECHONG999

"Parallel support, how to resolve the 'done' error bug now."


dutsik-p commented Jul 2, 2024

Does anybody know how to tune the prompt so the model doesn't issue multiple tool calls at the same time?


grrowl commented Aug 2, 2024

As a kind of "workaround" you can set parallelToolCalls to false (as introduced in #1937) — but lack of parallel tool calling support really should be mentioned on https://sdk.vercel.ai/docs/troubleshooting/common-issues

Frustrating that this common benefit of AI tool calling is broken on Vercel AI UI :(
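
A hedged sketch of that workaround, assuming the parallelToolCalls setting that #1937 added to the @ai-sdk/openai provider (the option name and placement may differ between SDK versions):

import { openai } from '@ai-sdk/openai';
import { streamUI } from 'ai/rsc';

export async function answer(prompt: string) {
  const result = await streamUI({
    // Disable parallel tool calls so streamUI only ever sees a single tool call
    // per response, sidestepping the "UI stream is already closed" error.
    model: openai('gpt-4o', { parallelToolCalls: false }),
    prompt,
    text: ({ content }) => content,
  });
  return result.value;
}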
