
Sending stream responses (experimental) #1327

Closed
pi0 opened this issue Jun 20, 2023 · 13 comments
@pi0
Member

pi0 commented Jun 20, 2023

Moving from (#19)

With h3@1.7.0 and the new `event._handled` flag, it is possible to take over the h3 response mechanism and send streams. For Workers, we can check for the existence of the `res._body` property. A direct fetch in unenv (src) passes the body as-is.

This way we can implement sendStream with the minimum possible changes. Once it is tested enough and works for the common presets, I plan to add it as a built-in (nitro/h3) utility.

Example usage: https://github.com/unjs/nitro-deploys/blob/main/routes/stream.ts

Working deployments:

(video attachment: Screen.Recording.2023-06-21.at.02.56.44.mov)
@pi0 pi0 added the enhancement (New feature or request) and workaround available labels on Jun 20, 2023
@pi0 pi0 changed the title Sending stream responses (experimental workaround) Sending stream responses (experimental) Jun 20, 2023
@Hebilicious
Contributor

This approach is working great for me, see here.

@pi0 Could we replace the sendStream in h3 with this updated version instead of adding it to nitro?

@jlucaso1

I'm experiencing problems detecting when the request is closed to clean up processing.

  event.node.req.once("close", () => {
    cleanupFunction() // <-- this is never called
  })

@pi0 can you update the example of streams to handle this? I don't know if this is a bug.
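For Node.js targets, one pattern that may help here (a hedged sketch; `onClientDisconnect` is my name, not an h3 API): listen for "close" on the response object rather than the request. Node's ServerResponse emits "close" both on normal completion and when the underlying connection terminates prematurely, whereas "close" on the request may never fire while a streamed response is still being written.

```typescript
// Sketch: register cleanup on the response's "close" event.
// `res` is structurally typed so event.node.res satisfies it.
function onClientDisconnect(
  res: { once(event: "close", cb: () => void): unknown },
  cleanup: () => void
): void {
  // Fires on normal completion AND on premature connection teardown;
  // `once` guarantees cleanup runs a single time.
  res.once("close", cleanup);
}

// Usage inside a handler (illustrative):
// onClientDisconnect(event.node.res, cleanupFunction);
```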

@dosstx

dosstx commented Jun 30, 2023

So how does one set this up for Nuxt?

@Hebilicious
Contributor

> I'm experiencing problems detecting when the request is closed to clean up processing.
>
>     event.node.req.once("close", () => {
>       cleanupFunction() // <-- this is never called
>     })
>
> @pi0 can you update the example of streams to handle this? I don't know if this is a bug.

For node.js, you could pass a cleanup function to your sendStream handler, and run it as part of the close() callback of WritableStream.

function sendStream(event: H3Event, stream: ReadableStream, cleanup: () => void) {
  // Mark the event as handled to prevent h3 from sending a response itself
  event._handled = true;

  // Workers (unenv): assign the stream as the response body directly
  (event.node.res as unknown as { _data: BodyInit })._data = stream;

  // Node.js: pipe the stream into the response manually
  if (event.node.res.socket) {
    stream.pipeTo(
      new WritableStream({
        write(chunk) {
          event.node.res.write(chunk);
        },
        close() {
          cleanup();
          event.node.res.end();
        },
      })
    );
  }
}
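For reference, the Node.js branch of the helper above can be exercised in isolation. A self-contained sketch (the `res` parameter is a stand-in for `event.node.res`, not a real ServerResponse) showing that the sink's `close()` runs the cleanup before ending the response:

```typescript
// Pipe a web ReadableStream into a response-like sink, running
// cleanup exactly once when the stream finishes normally.
async function pipeWithCleanup(
  stream: ReadableStream<Uint8Array>,
  res: { write(chunk: Uint8Array): void; end(): void },
  cleanup: () => void
): Promise<void> {
  await stream.pipeTo(
    new WritableStream<Uint8Array>({
      write(chunk) {
        res.write(chunk); // forward each chunk as it arrives
      },
      close() {
        cleanup();  // release timers/DB handles first...
        res.end();  // ...then finish the response
      },
    })
  );
}
```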

@jlucaso1

@Hebilicious
I attempted this approach; however, when I forcefully close the request, the cleanup() function does not get invoked as expected.

@Hebilicious
Contributor

@jlucaso1 Try abort()
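To expand on that hint: per the web streams spec, a WritableStream's `close()` callback only runs on normal completion; when the pipe is torn down forcefully, the sink's `abort()` callback runs instead. A hedged sketch (`makeSink` is my name) putting cleanup in both places:

```typescript
// A writable sink whose cleanup runs on BOTH normal completion
// and forceful termination (client disconnect, pipe error).
function makeSink(
  write: (chunk: Uint8Array) => void,
  cleanup: () => void
): WritableStream<Uint8Array> {
  return new WritableStream<Uint8Array>({
    write(chunk) {
      write(chunk);
    },
    close() {
      cleanup(); // normal end of stream
    },
    abort() {
      cleanup(); // forceful termination
    },
  });
}
```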

@pi0
Member Author

pi0 commented Jul 10, 2023

Update: Web stream support is landing directly in h3 (unjs/h3#432).

We still might need to tweak some presets for full compatibility and to experiment with worker support.
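Assuming unjs/h3#432 as merged (h3 >= 1.8), an event handler should be able to return a web ReadableStream directly and let h3 send it, with no manual workaround. A sketch under that assumption; the handler wrapper is shown as a comment, since the stream itself is plain web API:

```typescript
// Build a ReadableStream of encoded text chunks; with native web
// stream support, h3 can send this as the response body directly.
function helloStream(): ReadableStream<Uint8Array> {
  const encoder = new TextEncoder();
  return new ReadableStream<Uint8Array>({
    start(controller) {
      controller.enqueue(encoder.encode("hello "));
      controller.enqueue(encoder.encode("world"));
      controller.close();
    },
  });
}

// In an h3/nitro route (illustrative):
// export default defineEventHandler(() => helloStream());
```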

@dosstx

dosstx commented Jul 27, 2023

Can we use this in Nuxt yet, or not? Confused.

@pi0
Member Author

pi0 commented Jul 27, 2023

@dosstx The h3 changes are not released yet and are still being tested, but very soon you will be able to try it via the nuxt or nitro edge channels.

@dosstx

dosstx commented Jul 27, 2023

@pi0 OK thank you very much. Sorry to waste your time looking at these kinds of questions. Looking forward to the release!

@Hebilicious
Contributor

> @pi0 OK thank you very much. Sorry to waste your time looking at these kinds of questions. Looking forward to the release!

You can already use it with a few workarounds: vercel/ai#295
As pi0 said, soon you will be able to use it without the workarounds.

@Giancarlo-Ma

Giancarlo-Ma commented Dec 9, 2023

How can I intercept the streaming content to save it to a DB on a CF worker?

I have tried this:
```ts
const reader = stream.getReader();
const { readable, writable } = new TransformStream();
// Workers (unenv)
// @ts-expect-error _data will be there.
event.node.res._data = readable;
const writer = writable.getWriter();
const textDecoder = new TextDecoder("utf-8");
// Pump the stream into writable, logging each chunk via the reader
while (true) {
  const { done, value } = await reader.read();
  if (done) {
    console.log("Stream complete");
    await writer.close();
    break;
  }
  console.log(textDecoder.decode(value));
  await writer.write(value);
}
```

but I only get one whole piece after a while, not real streaming on the front end.
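I can't say from here why Workers delivered the body in one piece, but a pattern that avoids pumping through a TransformStream entirely is `tee()`: return one branch for the response and consume the other independently for the database. A hedged sketch (`splitForLogging` and `save` are my names, not Nitro APIs):

```typescript
// Split a stream: the first branch goes to the client untouched,
// the second is accumulated as text and handed to `save` at the end.
function splitForLogging(
  stream: ReadableStream<Uint8Array>,
  save: (text: string) => void
): ReadableStream<Uint8Array> {
  const [toClient, toLogger] = stream.tee();

  // Consume the logging branch concurrently; the client branch is
  // not held back, so chunks still flow out as they arrive.
  (async () => {
    const reader = toLogger.getReader();
    const decoder = new TextDecoder();
    let collected = "";
    for (;;) {
      const { done, value } = await reader.read();
      if (done) break;
      collected += decoder.decode(value, { stream: true });
    }
    collected += decoder.decode(); // flush any trailing bytes
    save(collected);
  })();

  return toClient; // e.g. assign this to event.node.res._data on Workers
}
```

Note that `tee()` buffers the slower branch, which is fine as long as the logging consumer keeps up with the stream.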

@pi0
Member Author

pi0 commented May 16, 2024

Streaming support should be more stable now. Please report any issues you encounter so we can improve 🙏🏼

@pi0 pi0 closed this as completed May 16, 2024