Rationale for "Passing the protocol" #122
Comments
I'm also unsure about the meaning of
(6 months later) Hi! Thanks for this comment. The broad strokes of this decision were agreed on at committee. Some details may change. Let's try and puzzle out what the rationale really is. If it seems good, or at least better than nothing, we can add it to the details page—maybe even a NOTE in the specification. Before I start, let me just say: I 100% sympathize with your points here. To me, like you, iterators are simple and they're about data, while generators are... well, it's a complex and subtle API, and it's not super clear what it's about. The coroutine protocol requires some extra cooperation between the coroutine and its user, to be useful—unlike iteration, where the iterator and the loop that consumes it are often completely independent. For now I only have time to dispense with the easiest parts of what you're raising here. I'll have to continue later.
I hope this helps. It has been a while since I looked at this proposal, but I think it's good and @ystartsev and I hope to get it moving again.
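To make the "extra cooperation" mentioned above concrete, here is a minimal sketch of the coroutine half of the protocol (the `greeter` example is invented for illustration, not taken from the proposal):

```js
function* greeter() {
  // The value of this yield expression is whatever the consumer
  // passes to the next() call that resumes us.
  const name = yield "What's your name?";
  yield `Hello, ${name}!`;
}

const gen = greeter();
console.log(gen.next().value);      // "What's your name?"
console.log(gen.next("Ada").value); // "Hello, Ada!"
```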
I should have started with this: I think the proposal has been largely rewritten since that section of DETAILS.md was written. I'm not sure the proposal still forwards `next` arguments; I'll have to check the details.
The proposal is being rewritten on top of generator primitives, so it still "passes the protocol", so to speak, the same way generators do.
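For intuition, here is roughly what a generator-based helper looks like (a sketch assuming a plain `for..of` implementation, not the spec text):

```js
// A map helper written as a generator. Because it is itself a generator,
// it responds to next/return/throw like any generator: closing it early
// makes the for..of forward a return() to the underlying iterable, but
// any argument passed to next() lands on this yield and is simply dropped.
function* map(iterable, fn) {
  for (const value of iterable) {
    yield fn(value);
  }
}
```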
Thanks for the reply. I pretty much agree with what you're saying. You also note that the spec may be ahead of the markdown, so I'm trying to piece together what the spec really says. All it really says is that
The other part of this is return values from `yield`.

Here's my simple example: you write a coroutine generator, which is to say a generator whose progression may be influenced by return values. Here's a simple one that yields numbers but lets you skip some:

```js
function* range() {
  for (let i = 0; ; i++) {
    const nexti = yield i;
    if (nexti !== undefined) i = nexti;
  }
}
```

Now I compose it with `map`:

```js
function* () {
  const iter = range().map(i => -i);
  let i = iter.next().value;
  while (true) yield i = iter.next(i - 1).value;
}
```

But of course this code is broken. Instead of causing the range generator to skip every other value, we instead cause it to always yield its initial value.

In my mind there are two options. First, you can just not try to pass yield returns through `map`. The code can then be changed to respect the API as defined by the real coroutine, and the map step just becomes inlined:

```js
function* () {
  const iter = range();
  let i = -iter.next().value;
  while (true) yield i = iter.next(i + 1).value;
}
```

The second option would be to modify the `map` API to allow specification of a reciprocal mapper, i.e. a transformation for yield return values:

```js
function* () {
  const iter = range().map(i => -i, i => -i); // note the second callback: the yield return transformer
  let i = iter.next().value;
  while (true) yield i = iter.next(i - 1).value;
}
```
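For concreteness, here is how `range`'s feedback protocol behaves when driven by hand (a quick sketch based on the definition above):

```js
const it = range();
it.next().value;   // 0
it.next().value;   // 1
it.next(10).value; // 11: the argument replaces the counter, which then advances
```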
Actually, I think it would be better to combine both suggestions.
Hey, I don't fully understand the issue here about passing return/throw, but I wanted to make sure my use case was supported even if that was dropped. The use case is stream async iterators, where breaking out of the loop cancels the stream:

```js
for await (const chunk of stream) {
  break;
}
// now stream is canceled
```

Can we be assured that this property will be preserved even when iterator helpers are involved? I.e., the following works?

```js
for await (const chunk of stream.values().map(...).filter(...)) {
  break;
}
// now stream is canceled
```
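For reference, the property in question can be seen with a plain sync generator (a sketch; the async stream case is analogous):

```js
function* source() {
  try {
    let i = 0;
    while (true) yield i++;
  } finally {
    // Runs when the consumer breaks out of the loop: break triggers
    // a return() call on the iterator, which resumes us here.
    console.log("source cleaned up");
  }
}

for (const x of source()) {
  if (x >= 2) break;
}
// logs "source cleaned up"
```

The question is whether interposing `.map(...).filter(...)` still forwards that `return()` call to the source.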
From @bakkot:

Effectively the same concern as @domenic, but the change in wording made the concern more concrete for me.
Another concern here from @hax:
Yes, absolutely. This is about something different. You might not know this, but generators allow you to do some tricky things. Also, if you haven't, check out iter-tools, because it's the only library I'm aware of that can transform sync or async iterables and guarantees (100% test coverage) that `return` is propagated.
@domenic Your use case will definitely continue to work. Everyone agrees that propagating `return` is important.
@conartist6 This refers to a new proposal that is not yet accepted; see @tc39/proposal-deiter. The proposal uses the parameter passed to `next`. Passing this "option" argument through iterator helpers would make sense for deiter.
I think that in this case, the better solution would not be a value passed to `next`, but possibly a new method?
Yes, I already found that, but I can find no corroboration for the statement "deiter (double ended iterators) rely on param in next(param)". Edit: OK, found it here. Reversing an iterator is not the same thing as reversing the direction of data flow; that is to say, even if you take the values in your iterable in reverse, you're still going from data => iterator/transform/consumer.
Also, I am strongly opposed to double-ended iterators, but I guess I'd better go over there and explain why, or what I mean by that.
@codehag It's possible to use a new method, but consider how generators work: if we want to support writing deiters via generators and keep compatibility with single-direction iterators, we need to pass the param to `next`. PS: Rust's double-ended iterators actually use a separate method named `next_back`.
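Here is a hypothetical sketch of the shape @hax is describing (the `next("back")` convention and the `double` generator are invented for illustration; they are not the proposal's actual API):

```js
// A double-ended iterator written as a generator: the argument passed
// to next() selects which end the next element comes from.
function* double(values) {
  let front = 0;
  let back = values.length - 1;
  let dir;
  while (front <= back) {
    dir = yield (dir === "back" ? values[back--] : values[front++]);
  }
}

const it = double([1, 2, 3, 4]);
it.next().value;       // 1 (the first next() call's argument is always discarded)
it.next("back").value; // 4
it.next().value;       // 2
```

Calling `next()` with no argument degrades to ordinary front-to-back iteration, which is the compatibility property mentioned above.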
Note that, per tc39/proposal-deiter#11, the deiter proposal may move to a separate method. Actually, it's trivial to make iterator helpers work for deiter, but I found that speccing them via "Abstract Closure" is even harder than the traditional way.
@michaelficarra and I spent a good while thinking about this and ultimately came to basically the same conclusion as the OP: iterator helpers are necessarily going to break many uses of the non-iteration parts of the generator protocol, so they shouldn't even try to preserve those parts. In #194 we removed the implementations of those parts.
The details page says "There are a number of decisions which could be made differently. This document attempts to catalog them along with the rationales for the choices currently made." It does not, however, offer any rationales.
I'm particularly interested in the rationale behind the "passing the protocol" choice made for this API. I think of the iterator spec as having been designed for two separate usages: coroutines and data transformation. In data transformation, being able to give arguments to `next` is useless; control flows only from the data.
So why are we transforming coroutines? If I understand, we're basically talking about being able to write this line in a generator:

```js
const arg = yield result;
```

Taken this way it seems to indicate some feedback from the consumer about the result yielded. But this only really makes sense to me if you know the consumer got the result you gave. The generator won't know if it's being mapped, so how could it be sensible for the consumer (who won't know what the generator yielded prior to the `map` operation) to give feedback? `filter` is even more of a head-scratcher, as the generator of values won't know whether the feedback given was about the value it just emitted or about the last value it emitted that passed the `filterer` predicate (which it doesn't know exists). In addition, these operations (and `drop` and `take`) can be combined, with the effect that while a value input to the iterator chain will come out somewhere, it would seem impossible to reason about where, and especially not without knowing the whole chain of transformations inside the coroutine, in which case they should just be expressed using normal control flow logic: function calls instead of `map` and `if` statements instead of `filter`.

I propose removing this functionality, particularly the updated definition of IteratorNext.
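As a sketch of that last point, the negate-and-skip example from earlier in the thread could express the transform with ordinary control flow inside the coroutine, keeping the feedback protocol coherent (illustrative only; `negatedRange` is not from the proposal):

```js
function* negatedRange() {
  for (let i = 0; ; i++) {
    // The map step is an ordinary expression on the yielded value...
    const nexti = yield -i;
    // ...and the reciprocal transform is ordinary logic on the feedback.
    if (nexti !== undefined) i = -nexti;
  }
}
```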