
feat (rsc): add streamUI onFinish callback #1920

Merged: 13 commits into vercel:main, Jun 19, 2024

Conversation

@gclark-eightfold (Contributor) commented Jun 11, 2024

Summary

Adds an onFinish callback to streamUI.

Resolves #1845
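For context, here is a minimal sketch of how the new callback might be used from a server action. This is not the PR's confirmed signature: fields such as `usage` and `finishReason` are assumptions taken from the linked issue, and the model and action names are illustrative.

```tsx
// Hypothetical server action (.tsx) illustrating the new onFinish callback.
// The payload fields (`usage`, `finishReason`) are assumptions from issue #1845.
'use server';

import { streamUI } from 'ai/rsc';
import { openai } from '@ai-sdk/openai';

export async function submitMessage(prompt: string) {
  const result = await streamUI({
    model: openai('gpt-4o'),
    prompt,
    text: ({ content }) => <div>{content}</div>,
    onFinish: ({ usage, finishReason }) => {
      // Runs once streaming has completed; useful for logging or persistence.
      console.log('token usage:', usage);
      console.log('finish reason:', finishReason);
    },
  });

  return result.value;
}
```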

Tasks

  • implementation
  • test
  • example
  • docs
    • reference
    • example
  • changeset

@gclark-eightfold changed the title from "feat (core): add streamUI onFinish callback" to "feat (rsc): add streamUI onFinish callback" on Jun 11, 2024
@gclark-eightfold (Contributor Author)

For the "test", "example", and "docs:example" tasks, I couldn't find the right places to extend for the streamUI API. Please let me know if any of these are blockers for merging.

@gclark-eightfold marked this pull request as ready for review on June 11, 2024 at 17:26
gclark-eightfold added a commit to gclark-eightfold/ai that referenced this pull request Jun 11, 2024
@gclark-eightfold (Contributor Author)

@lgrammel If you get a chance, can you review these changes?

@lgrammel (Collaborator)

@gclark-eightfold General direction looks good. Tests would be important, since I'm thinking about refactoring the inner loop of streamUI into a stream similar to streamText at some point.

@gclark-eightfold (Contributor Author) commented Jun 13, 2024

@lgrammel I'll see about adding some tests soon, then. I did see there were some already written for the precursor render function, so I may adapt those or take some pointers from them unless there are better examples to model off of.

Edit: I now see the E2E tests that already exist for streamUI. Let me know if those are sufficient.

@lgrammel (Collaborator)

@gclark-eightfold I have concerns about going all in on E2E, because of the nature of E2E tests. I'd love to have tests similar to streamText, but that requires a refactoring on my end.

@Godrules500

Thanks for working on this! I'm looking forward to this one, since the lack of it has been preventing me from using streamUI!

@gclark-eightfold (Contributor Author)

@lgrammel I re-added some of the tests for the render function that were removed in #1825 in favor of E2E tests:
https://github.com/vercel/ai/pull/1825/files#diff-ba5d05e418f64a2fa3c1e20b39bc9378e4d29218f46156f7ac95fac0b1559e56

I can remove the snapshot tests that aren't related to the onFinish callback if preferred, since I believe those should be covered by the E2E tests; just let me know what your preference is there.

@lgrammel (Collaborator)

> For the "test", "example", and "docs:example" tasks, I couldn't find the right places to extend for the streamUI API. Please let me know if any of these are blockers for merging.

  • The example could go under examples/next-openai/app (a new API route / chat frontend).
  • The docs example could be a new page under content/examples/01-next-app/01-basics.

@lgrammel (Collaborator)

> @lgrammel I'll see about adding some tests soon, then. I did see there were some already written for the precursor render function, so I may adapt those or take some pointers from them unless there are better examples to model off of.
>
> Edit: I now see the E2E tests that already exist for streamUI. Let me know if those are sufficient.

It's fine as is for now.

@gclark-eightfold (Contributor Author)

@lgrammel Should all be added now.
I opted to add the onFinish documentation under content/examples/05-interface/03-token-usage.mdx to match content/examples/03-node/02-streaming-structured-data/10-token-usage.mdx.

Let me know if this should be changed in any way. I also wasn't able to figure out how to run a server for the MDX content to verify it all works, but I tried to keep it as close as possible to what already exists.

@lgrammel (Collaborator)

@gclark-eightfold Looks very good. I'll do some testing, also on the docs.

@lgrammel (Collaborator) left a comment

Looks great! Awesome work!

@lgrammel merged commit 520fb2d into vercel:main on Jun 19, 2024 (7 of 8 checks passed).
@gclark-eightfold deleted the on-finish-stream-ui branch on June 20, 2024 at 18:30.
@Godrules500

I've been playing with this, but I was curious: since onFinish cannot return a value like the text callback can, how can I display that the finish reason indicates a safety-settings violation?

@lgrammel (Collaborator)

> I've been playing with this, but I was curious: since onFinish cannot return a value like the text callback can, how can I display that the finish reason indicates a safety-settings violation?

If you use a streamable value or streamable UI, you can update them from within the onFinish callback.
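To illustrate that answer, here is a rough sketch of surfacing a safety-related finish reason by updating a streamable UI from inside onFinish. It assumes the callback payload exposes a finishReason field (per issue #1845); the action name, the 'content-filter' value, and the markup are illustrative, not part of this PR.

```tsx
// Rough sketch: since onFinish has no return value, update a streamable UI
// created with createStreamableUI instead. The `finishReason` field is an
// assumption from issue #1845; names and markup below are illustrative.
'use server';

import { streamUI, createStreamableUI } from 'ai/rsc';
import { openai } from '@ai-sdk/openai';

export async function submitMessage(prompt: string) {
  // A separate streamable the client can render next to the main UI.
  const status = createStreamableUI(null);

  const result = await streamUI({
    model: openai('gpt-4o'),
    prompt,
    text: ({ content }) => <div>{content}</div>,
    onFinish: ({ finishReason }) => {
      if (finishReason === 'content-filter') {
        status.done(<p>The response was blocked by the safety settings.</p>);
      } else {
        status.done(null);
      }
    },
  });

  return { ui: result.value, status: status.value };
}
```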


Successfully merging this pull request may close these issues:

Update StreamUI to include finishReason, usage, and any other relevant data