ReadableStream should be an async iterable #778
https://jakearchibald.com/2017/async-iterators-and-generators/#making-streams-iterate is an article on this, including a basic implementation.
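For reference, a minimal sketch of such a wrapper (not the article's exact code; the helper name is illustrative):

```js
// Wrap a ReadableStream's reader in an async generator so it can be
// consumed with for-await. A sketch, not the article's exact code.
async function* streamAsyncIterator(stream) {
  const reader = stream.getReader();
  try {
    while (true) {
      const { done, value } = await reader.read();
      if (done) return;
      yield value;
    }
  } finally {
    // Runs on break/throw as well as normal completion.
    reader.releaseLock();
  }
}

// Usage: for await (const chunk of streamAsyncIterator(response.body)) { … }
```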
Excellent! I really look forward to this!
There are two approaches to this:
After realizing we'd have to make …
Further questions that have come up as I contemplate this all a bit more seriously:
I'm leaning toward auto-canceling and auto-releasing, because otherwise cleanup is quite annoying:

```js
try {
  for await (const chunk of rs) {
    await somethingThatMightReject(chunk);
  }
} finally {
  try {
    // Might throw if the reader is still locked because we finished
    // successfully without breaking or throwing.
    await rs.cancel();
  } catch {}
}
```

It's starting to feel a bit magic, though…
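For contrast, a minimal sketch of the same consumer under the auto-cancel/auto-release semantics being discussed (not yet specified behavior):

```js
// Assuming ReadableStream implements Symbol.asyncIterator with
// auto-cancel: breaking or throwing out of the loop would cancel
// the stream and release the reader automatically.
for await (const chunk of rs) {
  await somethingThatMightReject(chunk);
}
```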
Here is the implementation that I currently use to convert a ReadableStream to an async iterator, and it works well enough for what I need. I don't think I ever cancel iterating the stream, though; maybe only when throwing an exception in a loop. I also agree that it should auto-cancel and auto-close, as (async) iterators are most often single-consumer. I think the API should be optimized for the single-consumer case and require a small wrapper for cases where it should not automatically close. That seems like the most common scenario, and it doesn't make the multi-consumer scenario impossible to support.
I prefer a wrapper object and auto-cancel. Not auto-cancelling leads to simpler code when your input has a distinct header and body: you can break out of the first loop when you get to the end of the header, and then have a second loop which processes the body. But this is a niche use case, and there are other ways of doing it. The rest of the time, auto-cancel is cleaner and less error-prone. I don't think I favour auto-release. Assuming we have auto-cancel and auto-close, there are extremely limited extra capabilities you'd gain from auto-release:

Maybe auto-release has some aesthetic benefits I haven't considered.
Wrapper object seems like a clear winner. I didn't like the sound of auto-cancel, but given @domenic's code example it sounds like the better thing to do. We could have:

```js
for await (const chunk of stream.iterator({ preventClose: true })) {
  // …
}
```

Then …
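To illustrate the header/body case from earlier with this proposed option, a sketch might look like the following (the option name follows the proposal above and is not settled API; `isEndOfHeader` and `processBody` are hypothetical helpers, and this assumes the iterator releases its lock on early exit):

```js
async function readMessage(stream) {
  // First loop: stop at the end of the header without closing the stream.
  // preventClose is the proposed opt-out, not settled API.
  for await (const chunk of stream.iterator({ preventClose: true })) {
    if (isEndOfHeader(chunk)) break; // hypothetical predicate
  }
  // Second loop: default behavior, auto-cancel/close when done.
  for await (const chunk of stream.iterator()) {
    processBody(chunk); // hypothetical body handler
  }
}
```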
@jakearchibald @domenic I would like Node.js … One thing that is not obvious is whether we should close the stream when there is a …
@mcollina see the above comments for the API discussion so far. Is there somewhere we can read up on the justifications and tradeoffs for the pattern Node is shipping?
@jakearchibald I added async iterators based on what I thought would make sense for Node.js streams; we also did a couple of implementations, and this one turned out to be more performant*. It's an experimental feature that is shipping for the first time in two weeks (it will print a warning to notify users). We can definitely change any part of what I did in the implementation before it gets out of experimental (ideally before Node 10 goes LTS in October, but even later if things are in flux).

I guess my implementation of …
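For context, consuming a Node.js stream this way looks roughly like the following (a sketch against the experimental Node 10 support, where Readable implements Symbol.asyncIterator):

```js
const fs = require('fs');

async function main() {
  const readable = fs.createReadStream('./file.txt');
  // Readable implements Symbol.asyncIterator, so for-await works directly;
  // Node 10 prints an experimental warning when this is used.
  for await (const chunk of readable) {
    console.log(`Read ${chunk.length} bytes`);
  }
}

main().catch(console.error);
```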
@devsnek has generously volunteered to help with the spec and tests here, and I'm very excited. Summing up some IRC discussion:
In a large code base using our own stream-to-async-iterator conversion, we did this and are pretty happy with it as a default.

Definitely agree from our perspective.

We also do this and find it quite useful. It is also easy to opt out of even without providing …

That said, I am not objecting to a … I've also been using async iterators with Node streams, and they have been working well, though I admit a lot less than I'd like to. I have been trying to solicit feedback (as in, emailing relevant parties) but haven't been able to get it. We hope to solicit more feedback through a Node.js Foundation survey which we hope to send in about a month.
Given the precedent that is somewhat being established in WICG/kv-storage#6, I am wondering whether we should rename …

I prefer …

Uhm, at the moment it's called …
I forgot to file implementer bugs yesterday. Here they are: …
I have some code like this, but VS Code shows some errors:

```js
const response = await fetch(`${download_url_prefix}${node_executable_filename}`);
const contentLength = +(response.headers.get('Content-Length') ?? 0);
const reader = response.body?.getReader();
if (!reader) {
  throw new Error(`Failed to get reader from response body`);
}
let receivedLength = 0;
let chunks = [];
// Error here: reader.read() returns a single promise, not an async iterable.
for await (let { done, value } of reader.read()) {
  chunks.push(value);
  receivedLength += value.length;
  console.log(`Received ${receivedLength} of ${contentLength}`);
}
```
I have to rewrite the above like this:

```js
while (true) {
  const { done, value } = await reader.read();
  if (done) {
    break;
  }
  chunks.push(value);
  receivedLength += value.length;
  console.log(`Received ${receivedLength} of ${contentLength}`);
}
```
@liudonghua123 You shouldn't use a reader for this; you can iterate the stream itself:

```js
const readable = response.body;
if (!readable) {
  throw new Error('Failed to get response body');
}
let receivedLength = 0;
let chunks = [];
for await (const value of readable) {
  chunks.push(value);
  receivedLength += value.length;
  console.log(`Received ${receivedLength} of ${contentLength}`);
}
```

MDN has more examples on async-iterating a stream. 😉

(Note that this GitHub project is not really a support forum. For future questions, I would recommend searching on e.g. Stack Overflow, where there are already answers to similar questions.)
@MattiasBuelens Thanks, I see now. 😄
(I'm adding this mostly as a tracking bug after I asked about it in the wrong place)
Node.js is looking into supporting async iterables as a way to stream data, and it would be great if fetch (or the readable-stream part of fetch) supported the same interface. This would make it easier to move code between Node.js and the browser.
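As a sketch of that portability, a consumer written purely against the async-iteration protocol could run unchanged against a browser ReadableStream (once it implements Symbol.asyncIterator, as proposed here) or a Node.js Readable:

```js
// Works with any async iterable of byte chunks (Uint8Array or Buffer):
// a browser ReadableStream, once async-iterable, or a Node.js Readable.
async function byteLength(chunks) {
  let total = 0;
  for await (const chunk of chunks) {
    total += chunk.length;
  }
  return total;
}

// Browser: await byteLength(response.body)
// Node.js: await byteLength(fs.createReadStream('file.bin'))
```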