Not that great for implementing transducers? #48

Closed · jedwards1211 opened this issue Dec 26, 2019 · 31 comments

Labels: documentation (Improvements to docs are requested), question (Further information is requested)

@jedwards1211 (Contributor)

Since the API is so similar to Observable, I find it great for consuming observable-like things, but I'm not as sold on using it for non-observable-like things, like transducers or other async iterator combinators.

It seems almost simpler to implement transducers like map correctly using the raw async iterator protocol than using Repeater.

The naive approach (repeaterMap below) is to use a for await loop, but this keeps consuming input after the caller returns, which would be especially bad if the input is infinite, whereas a raw async iterator can immediately forward return and throw calls to the input.

A less naive attempt (awkwardRepeaterMap) is to do the for await in a separate function and break once stop is detected. However, more calls to next() still go through than with the raw async iterator.

Thoughts?

Code

const { Repeater } = require('@repeaterjs/repeater')

const range = n => ({
  i: 0,
  n,
  async next() {
    console.log('next()')
    const i = ++this.i
    if (i >= this.n) return { done: true }
    return { value: i, done: false }
  },
  async return() {
    console.log('return()')
    this.i = this.n
    return { done: true }
  },
  async throw(error) {
    console.log('throw()')
    this.i = this.n
    return { done: true }
  },
  [Symbol.asyncIterator]() {
    return this
  },
})

const repeaterMap = (iterable, iteratee) =>
  new Repeater(async (push, stop) => {
    for await (const value of iterable) push(iteratee(value))
    await stop
  })

const rawMap = (iterable, iteratee) => ({
  iterator: iterable[Symbol.asyncIterator](),
  async next() {
    const { value, done } = await this.iterator.next()
    return { value: iteratee(value), done }
  },
  return() {
    return this.iterator.return()
  },
  throw(error) {
    return this.iterator.throw(error)
  },
  [Symbol.asyncIterator]() {
    return this
  },
})


const awkwardRepeaterMap = (iterable, iteratee) =>
  new Repeater(async (push, stop) => {
    let stopped = false
    async function next() {
      for await (const value of iterable) {
        if (stopped) break
        push(iteratee(value))
      }
    }
    next()
    await stop
    stopped = true
  })

async function go() {
  console.log('rawMap')
  for await (const i of rawMap(range(10), i => i * 2)) {
    console.log(i)
    if (i >= 5) break
  }
  console.log('\nrepeaterMap')
  for await (const i of repeaterMap(range(10), i => i * 2)) {
    console.log(i)
    if (i >= 5) break
  }
  console.log('\nawkwardRepeaterMap')
  for await (const i of awkwardRepeaterMap(range(10), i => i * 2)) {
    console.log(i)
    if (i >= 5) break
  }
}

go()

Output

rawMap
next()
2
next()
4
next()
6
return()

repeaterMap
next()
next()
next()
2
next()
next()
next()
4
next()
next()
6
next()
next()

awkwardRepeaterMap
next()
next()
next()
2
next()
next()
next()
4
next()
next()
6
next()
return()
@jedwards1211 (Contributor, Author) commented Dec 26, 2019

To be fair, my rawMap implementation is probably a bit flawed as far as having return wait for pending calls to next to finish. (Never mind, it's fine because it just defers return to its input.) But in any case, it seems like an ideal map implementation shouldn't eagerly pull values from its input.

@brainkim (Member) commented Dec 26, 2019

@jedwards1211 have you tried awaiting the push call in your repeater examples? Without any sort of pausing mechanism within the for await loop, you're simply unspooling the range iterator as fast as you can. Your raw iterator steps through the range iterator because it only calls next once per iteration. In other words, do something like this:

const repeaterMap = (iterable, iteratee) =>
  new Repeater(async (push, stop) => {
    for await (const value of iterable) await push(iteratee(value))
    await stop
  })

I write about what the promise returned from push does in the overview:

In addition, the executor API exposes promises which resolve according to the state of the repeater. The push function returns a promise which resolves when next is called, and the stop function doubles as a promise which resolves when the repeater is stopped. As a promise, stop can be awaited to defer event listener cleanup.

const repeater = new Repeater(async (push, stop) => {
  console.log("repeater started!");
  await push(1);
  console.log("pushed 1");
  await push(2);
  console.log("pushed 2");
  await stop;
  console.log("done");
});
(async () => {
  console.log(await repeater.next());
  // repeater started!
  // { value: 1, done: false }
  console.log(await repeater.next());
  // "pushed 1"
  // { value: 2, done: false }
  console.log(await repeater.return());
  // "pushed 2"
  // "done"
  // { done: true }
})();

@brainkim (Member) commented Dec 26, 2019

Also some additional thoughts:

  1. I don’t think you need repeaters to express a map combinator; async generators are more than sufficient:
async function *map(iterable, iteratee) {
  for await (const value of iterable) yield iteratee(value)
}

I also don’t export any combinator methods directly on the async iterator interface, because this would widen the API surface, most combinators are expressible using async generators, and the iterator helpers proposal would obsolete or clash with any instance methods I defined on the repeater class.

  2. If you don’t have any cleanup code, you don’t need to await stop. That’s usually an indication that you don’t need a repeater in the first place. Literally every case where I’ve needed a repeater rather than an async generator, I’ve used both the push and stop functions/promises, as in the sketch below.
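
(For reference, a minimal sketch of that push-plus-stop pattern, wrapping a Node EventEmitter; the listen helper and the emitter are illustrative, not part of the library:)

const { EventEmitter } = require('events')
const { Repeater } = require('@repeaterjs/repeater')

const listen = (emitter, event) =>
  new Repeater(async (push, stop) => {
    const listener = value => push(value)
    emitter.on(event, listener)
    await stop // resolves when the repeater is stopped or returned
    emitter.off(event, listener) // cleanup runs only after consumers are done
  })

// usage: for await (const value of listen(emitter, 'data')) { ... }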

@jedwards1211 (Contributor, Author)

Using an async generator suffers from the secure-stop issue though, and that's especially a problem with GraphQL subscriptions at the moment. I ended up using a raw iterator for map cases to solve the memory leaks I was having, because it's the only way I can ensure return/throw get forwarded to the input and stop the underlying Redis subscriptions right away.

I didn't realize I could await push; that makes sense!

@jedwards1211 (Contributor, Author)

Cool so this works well:

const repeaterMap = (iterable, iteratee) =>
  new Repeater(async (push, stop) => {
    let stopped = false
    stop.then(() => stopped = true)
    for await (const value of iterable) {
      if (stopped) break
      await push(iteratee(value))
    }
    await stop
  })

I didn't make it very clear, but even though this was a toy example that doesn't need cleanup, I was trying to test whether cleanup would occur when the input is either return()ed or iterated to completion. (I didn't realize until doing this testing that for await only calls return() if iteration is stopped prematurely.)


repeaterMap
next()
2
next()
4
next()
6
next()
return()

@brainkim (Member) commented Dec 26, 2019

Using an async generator suffers the secure stop issue though, and is especially a problem with GraphQL subs at the moment.

I think you said it best when you said:

Also FWIW, I don't know if I would say that async generators are the main problem here, if your async iterator creator function registers stuff before the first call to next(), the danger exists whether you're consuming the iterator with an async generator, for await loop, or even using the low level async iterator API.

tc39/proposal-async-iteration#126 (comment)

Unless I’m misunderstanding your secure-stop problem, repeaters wouldn’t help you here anyway, because the repeater executor won’t execute until the first time next is called, so there‘s always the possibility that, even when wrapping your iterator in a repeaterMap call, you still never reach the break statement in the for await loop. The answer is to go directly to the source iterators and fix them so they work lazily. If you can’t, you either have to manually thread some kind of return signal or “prime” the downstream repeater so that its executor is executed at least once.

Lemme know if you have any other questions or concerns or if anything is unclear.

@brainkim added the documentation and question labels Dec 26, 2019
@jedwards1211 (Contributor, Author) commented Dec 26, 2019

Here's some test code I wrote to illustrate the secure stop problem you're talking about when using my revised repeaterMap function.

The subscribe function simulates a GraphQL subscription. Basically, we want to unsubscribe from Redis ASAP after calling return() on the async iterator. This works when using the Redis async iterator directly or when wrapping it in a raw async iterator map. But when I wrap it with the revised repeaterMap implementation, it doesn't unsubscribe from Redis until the next event comes through, which is the same problem an async generator would have.

Theoretically I could implement map with a Repeater in such a way that it can pass return() calls to the input iterator immediately, but at that point I might as well just use a raw async iterator.

Code

const { RedisPubSub } = require('graphql-redis-subscriptions')
const { Repeater } = require('@repeaterjs/repeater')

const debug = require('debug')('pubsub')

class LoggingPubSub extends RedisPubSub {
  publish(topic, ...args) {
    debug('  publish', topic)
    return super.publish(topic, ...args)
  }
  subscribe(trigger, ...args) {
    debug('  subscribe', trigger)
    return super.subscribe(trigger, ...args)
  }
  unsubscribe(subId) {
    const [triggerName = null] = this.subscriptionMap[subId] || []
    debug('  unsubscribe', triggerName)
    return super.unsubscribe(subId)
  }
}

const pubsub = new LoggingPubSub()

const repeaterMap = (iterable, iteratee) =>
  new Repeater(async (push, stop) => {
    let stopped = false
    stop.then(() => (stopped = true))
    for await (const value of iterable) {
      if (stopped) break
      await push(iteratee(value))
    }
    await stop
  })

const rawMap = (iterable, iteratee) => ({
  iterator: iterable[Symbol.asyncIterator](),
  next() {
    return this.iterator
      .next()
      .then(i => (i.done ? i : { value: iteratee(i.value), done: i.done }))
  },
  return() {
    return this.iterator.return()
  },
  throw(error) {
    return this.iterator.throw(error)
  },
  [Symbol.asyncIterator]() {
    return this
  },
})

async function subscribe(iterable, time) {
  const iterator = iterable[Symbol.asyncIterator]()
  async function iterate() {
    iterator.next().then(({ value, done }) => {
      if (!done) iterate()
    })
  }
  iterate()
  await new Promise(resolve =>
    setTimeout(() => {
      debug('  calling return()')
      iterator.return()
      resolve()
    }, time)
  )
}

async function go() {
  let i = 0
  setInterval(() => pubsub.publish('foo', i++), 10000)
  console.log('without async generator')
  await subscribe(pubsub.asyncIterator('foo'), 2000)

  console.log('\nwith raw map')
  await subscribe(rawMap(pubsub.asyncIterator('foo'), i => i * 2), 2000)

  console.log('\nwith repeaterMap')
  await subscribe(repeaterMap(pubsub.asyncIterator('foo'), i => i * 2), 2000)
}

go()

Output

$ DEBUG=pubsub node index.js 
without async generator
  pubsub   subscribe foo +0ms
  pubsub   calling return() +2s
  pubsub   unsubscribe foo +1ms

with raw map
  pubsub   subscribe foo +2ms
  pubsub   calling return() +2s
  pubsub   unsubscribe foo +0ms

with repeaterMap
  pubsub   subscribe foo +2ms
  pubsub   calling return() +2s
  pubsub   publish foo +4s
  pubsub   unsubscribe foo +3ms

@brainkim (Member) commented Dec 26, 2019

Hmmmm so I read through the code and went on a walk to think about it and here’s my interpretation:

The problem isn’t repeaters or async generators but for await. As you’ve noted elsewhere, the next and return calls happen sequentially in a desugared for await loop, so you have to call the next and return functions manually if you want combinators to transparently forward return calls to the source concurrently.
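
(Roughly, and simplified rather than spec-exact, a for await loop desugars to something like the sketch below, which is why return() can never overtake a pending next():)

// rough desugaring of `for await (const value of iterable) { ...body... }`
async function forAwaitSketch(iterable, body) {
  const iterator = iterable[Symbol.asyncIterator]();
  let exhausted = false;
  try {
    while (true) {
      const iteration = await iterator.next(); // nothing else proceeds until this settles
      if (iteration.done) {
        exhausted = true;
        break;
      }
      await body(iteration.value); // a `break` or `throw` in a real loop body jumps to finally
    }
  } finally {
    // return() is only called on early exit, and always after the pending next() has settled
    if (!exhausted && typeof iterator.return === "function") {
      await iterator.return();
    }
  }
}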

In addition, however, this partly works because the raw map async iterator has potentially surprising behavior where calls to next/return don’t settle in order. In other words, in the following code:

(async () => {
  const iter = rawMap(sourceIter, (i) => i * 2);
  const nextP = iter.next();
  const returnP = iter.return();
  console.log(await Promise.race([returnP, nextP]));
})();

the race might resolve to returnP, depending on the behavior of sourceIter. For the most part, you’ll never notice this because sourceIter will usually settle iterations in order, but you’re noticing it here because the source also does not settle iterations in order:

https://github.com/davidyaha/graphql-redis-subscriptions/blob/master/src/pubsub-async-iterator.ts

If my reading of the code is correct, next and return for PubSubAsyncIterator both await subscriptionIds, but next additionally awaits this.pullValue, and while return will resolve all values in the pullQueue, thereby causing all next calls to resolve, it does not wait for all the handlers of those next calls to fire, so it fulfills before they do. You can test this assumption by doing something like:

import { RedisPubSub } from 'graphql-redis-subscriptions';
const pubsub = new RedisPubSub();
const iter = pubsub.asyncIterator("SOMETHING_CHANGED");
const p1 = iter.next();
const p2 = iter.return("I WON");
Promise.race([p2, p1]).then((value) => console.log(value));
// {value: "I WON", done: true}

If the above code doesn’t print what I expect, then I’ve misunderstood the graphql-redis-subscriptions code (sorry, I don’t feel like spinning up a Redis instance right now to test 😛).

@brainkim (Member)

I would probably go inform the async-iterator-helpers peeps of this possible complication with using for await for combinators, but I have no desire to read through the spec, the source for core-js is labyrinthine, and GitHub search is terrible. Maybe they call next/return manually.

@brainkim (Member) commented Dec 26, 2019

Theoretically I could implement map with a Repeater in such a way that it can pass return() calls to the input iterator immediately, but at that point I might as well just use a raw async iterator.

Even if you call next/return manually, repeaters can still be helpful here because they make iterations settle in order and implement the next/return/throw methods in a way that is indistinguishable from async generators. I, for instance, implement the Repeater.race, Repeater.merge, Repeater.zip, and Repeater.latest combinators as static methods which call next/return manually. I guarantee you will not find a clearer or more bug-free implementation of these methods elsewhere, and they’re virtually indistinguishable from async generators to boot.

https://github.com/repeaterjs/repeater/blob/master/packages/repeater/src/repeater.ts#L502-L724
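
(For orientation, a usage sketch of those static combinators; it assumes the documented call shape, in which each method takes an array of async iterables and/or promises and returns a Repeater, and the toy generators are placeholders:)

const { Repeater } = require("@repeaterjs/repeater");

async function* letters() { yield "a"; yield "b"; }
async function* numbers() { yield 1; yield 2; }

(async () => {
  // merge: values from either source, in whatever order they arrive
  for await (const value of Repeater.merge([letters(), numbers()])) {
    console.log("merged:", value);
  }
  // zip: lockstep tuples, e.g. ["a", 1] then ["b", 2]
  for await (const pair of Repeater.zip([letters(), numbers()])) {
    console.log("zipped:", pair);
  }
})();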

@jedwards1211 (Contributor, Author) commented Dec 26, 2019

So you're right that that code could probably be improved, but in PubSubAsyncIterator, this.subscribeAll() memoizes itself into this.subscriptionIds, and emptyQueue awaits this.subscriptionIds, so return basically ends up awaiting the same promise. I'm 99% sure, looking at the code, that this.subscriptionIds gets set synchronously before the call to return(). And I assume that when this.subscriptionIds resolves, next() will get resumed and call pullValue() before emptyQueue gets resumed?

@brainkim (Member) commented Dec 26, 2019

Ahh, I see: this.listening is set to false before emptyQueue awaits this.subscriptionIds, so it causes next calls which were stuck on await this.subscribeAll to continue with a call to this.return instead of this.pullValue, according to the conditional expression. However, I feel like this is still one microtick slower than the original iter.return call, so the original iter.return call should still win the race, right?

So you're right that that code could probably be improved

To be clear, I wasn’t criticizing the code; I don’t think it’s of poor quality, and it seems pretty readable as far as async code usually goes. My secret wish is that someone takes the repeater codebase and rewrites it, keeping only the unit tests, because I feel like the code has gotten pretty hairy to handle a bunch of important edge cases.

@brainkim (Member)

Here’s how you could implement a map combinator with concurrent returns using repeaters:

function map(iterable, fn) {
  return new Repeater(async (push, stop) => {
    const iter = iterable[Symbol.asyncIterator]();
    let finalIteration;
    stop.then(() => {
      finalIteration = finalIteration || iter.return();
    });
    while (!finalIteration) {
      const iteration = await iter.next();
      if (iteration.done) {
        finalIteration = finalIteration || iteration;
        break;
      }
      await push(fn(iteration.value));
    }
    // TODO: only await finalIteration if it’s a promise
    finalIteration = await finalIteration;
    return finalIteration.value;
  });
}

It can probably be cleaned up a bit, but as you can see, repeaters are still somewhat valuable here.

@jedwards1211 (Contributor, Author)

Awaiting finalIteration doesn't hurt anything if it's a plain value, does it? const foo = await 2 works perfectly fine and sets foo to 2.

@jedwards1211 (Contributor, Author)

The main other thing about doing that with a repeater is that it seems like a lot more overhead than the raw async iterator. If fn itself is allowed to be asynchronous, maybe it's a different story (if you're saying that the repeater magically ensures that return resolves after the next handlers).

@brainkim (Member) commented Dec 26, 2019

Awaiting finalIteration doesn't hurt anything if it's a value does it?

It only really matters if you’re the kind of person who counts microticks.

Actually we can clean it up a little like so:

function map(iterable, fn) {
  return new Repeater(async (push, stop) => {
    const iter = iterable[Symbol.asyncIterator]();
    let finalIteration;
    stop.then(() => {
      finalIteration = typeof iter.return === "function" ? iter.return() : {done: true};
    });
    while (!finalIteration) {
      const iteration = await iter.next();
      if (iteration.done) {
        stop();
        return iteration.value;
      }
      await push(fn(iteration.value));
    }
    // there’s no need to return finalIteration’s value here because when repeaters are returned, the return value will just be whatever was passed into the `return` method.
    await finalIteration;
  });
}

The main other thing about doing that with a repeater is that it seems like a lot more overhead than the raw async iterator. If fn itself is allowed to be asynchronous, maybe it's a different story (if you're saying that the repeater magically ensures that return resolves after the next handlers).

I completely respect this decision. However, here are some things you get for free when you opt into repeaters:

  1. All iterations settle in call order.
  2. If fn is async, the repeater will unwrap the value before passing it to next (so you never get iterations like {value: Promise, done: false}).
  3. If fn/iter.next/iter.return throw an error or reject, the repeater will both handle the error, so there isn’t an unhandled promise rejection, and pass it forward to the next/return methods.

These are all really hard things to implement yourself with a raw async iterator.
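
(As a concrete illustration of point 2, a minimal sketch; it assumes, per the description above, that promises passed to push are unwrapped before being delivered to next:)

const { Repeater } = require("@repeaterjs/repeater");

async function* source() { yield 1; yield 2; }
// an async mapping function: push receives a promise, not a plain value
const doubleLater = i => new Promise(resolve => setTimeout(() => resolve(i * 2), 10));

const mapped = new Repeater(async (push, stop) => {
  for await (const value of source()) {
    await push(doubleLater(value));
  }
  stop();
});

(async () => {
  for await (const value of mapped) console.log(value); // 2, then 4 (plain numbers, not promises)
})();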

@brainkim (Member)

I can keep tweaking the code snippet I have above to handle more situations: maybe you want to race iter.next with the final iteration, or pass an argument to the mapped iterator’s next method, or throw an error into the inner iterator. I just don’t have the time to edit it right now. If you paste what you end up with, I can take a look and provide feedback.

@jedwards1211 (Contributor, Author) commented Dec 27, 2019

Really, the only other thing I can imagine needing to do in the near future is yield an initial value before going on to the Redis events. Probably the cleanest way to do that would be to push the initial value and then pass through events from the non-async-iterator API in RedisPubSub, as in the sketch below. That way I wouldn't be wrapping the push/pull queue iterator in RedisPubSub.asyncIterator with a repeater and its additional push/pull queues.
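
(A hypothetical sketch of that idea; the subscribe(trigger, onMessage) -> Promise<subId> and unsubscribe(subId) signatures are assumptions based on the LoggingPubSub code above, not a verified API:)

const { Repeater } = require('@repeaterjs/repeater')

const initialThenEvents = (pubsub, topic, initialValue) =>
  new Repeater(async (push, stop) => {
    await push(initialValue) // yield the initial value first
    const subId = await pubsub.subscribe(topic, payload => push(payload))
    await stop // wait until the consumer returns or stop() is called
    pubsub.unsubscribe(subId) // tear down the Redis subscription immediately
  })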

Since graphql-subscriptions and graphql-redis-subscriptions both contain push/pull async iterator utility classes that have diverged only in trifling details, I'm tempted to try to convince them to use repeaters instead. In fact I have wanted to extract their async iterator utility into its own package for a long time, but repeaters do pretty much the same thing and are more full-featured.

@brainkim (Member)

My main takeaway from this issue:
Combinators like map/filter can’t necessarily be expressed easily using async generators, because for await calls the next/return methods in sequence, whereas you would want to allow next/return to be called in parallel. This is something I didn’t realize, and for the longest time I simply insisted that you could implement map or filter with an async generator and for await (#26 (comment)). Maybe the actionable part of this issue would be to investigate creating a module of async iterator combinator functions. Maybe we could have them developed in user space and decide later. Alternatively, I’m starting to think that return/throw should resume async generators when they’re suspended on await/for await again, because it would make implementing these combinators so much easier...

Since graphql-subscriptions and graphql-redis-subscriptions both contain push/pull async iterator utility classes that have diverged only in trifling details, I'm tempted to try to convince them to use repeaters instead.

If you need any help doing this, feel free to ping me, email me, or whatever; I’d be happy to help. Part of me doesn’t want to shoulder the responsibility of trying to make GraphQL subscriptions more reliable or scalable, but after looking through the codebases I think they could definitely benefit from using something more repeater-like. If you need help convincing people or need some changes to the repeater codebase, let me know.

@jedwards1211 (Contributor, Author)

Combinators like map/filter can’t necessarily be easily expressed using async generators

Well, it really depends on what you're dealing with: if your input async iterable is just the lines of a file, then using an async generator for map or filter is really no big deal. Using async iterables for events throws a wrench in everything...

@yaacovCR

Hi! I'm struggling to learn about async programming, async iterators, repeaters, alternatives, etc., as I try to implement the @defer and @stream directives for graphql-tools. I am looking forward to using repeaterjs to avoid rolling my own async iterator implementation.

I will have to merge async iterables, map them, filter them, etc., so a bunch of the implemented functionality, the functionality explained above, and possibly the v4.0 functionality seems pretty appealing.

While considering taking the plunge with repeaters, I hit upon the following statement in the discussion above that seemed like an important point:

  1. All iterations settle in call order.

Maybe you might have time to elaborate on how raw async iterators do not achieve this and repeaters do, and why this is desirable? Is this statement meant only with respect to next() and return(), or also with respect to different calls to next()?

--Thanks in advance from an async noob

@yaacovCR

Also -- I have to map, merge, filter, etc., async iterables that may hang, so I am planning on using the Repeater.race/timeout example to make sure that the for await loop does not hang forever.

Some additional questions about consumption of Repeaters:

(1) Is the for await loop "safe" for consuming async iterables, bottom line, given all of the above? I think the answer is either "yes" or "yes, as long as you don't care that cleanup might happen after the next call to next".

(2) If I am using Repeater.race because I think the promise might never settle, does using Repeater.race mean I don't have to worry about memory issues related to hanging promises? I think the answer is "no", there may still be memory issues, but I am stumped about what to do besides complaining to whoever passed me the subscriber function that THEY didn't implement a timeout....

@yaacovCR

In terms of the things settling in call order, is that only in cases when you are awaiting the push as in:

await push(...);

So that you do not push an additional event until the first settles? But your guide mentions other patterns, of course, right?

@brainkim (Member) commented Dec 18, 2020

@yaacovCR

Maybe you might have time to elaborate on how raw async iterators do not achieve this and repeaters do, and why this is desirable? Is this statement meant only with respect to next() and return(), or also with respect to different calls to next()?

By iteration I just mean any call to next(), return(), or throw(). If the promises returned from these methods do not resolve in call order, this is surprising behavior which might cause race conditions or bugs. You might, for instance, have a chunking combinator function which collects multiple values from an async iterator and puts them in an array. If the iterations of the iterator settled out of order, you might accidentally reorder the values of the iterator, depending on how you code the chunking function. More commonly, you might have some logic which cleans up when an iteration has the done property set to true. If this iteration fulfills before earlier iterations, the cleanup code would run before the other iterations come through, meaning those earlier iterations might be dropped, or the reactions to those earlier iterations might cause errors.

Most of the time, this stuff isn’t a problem, because in 90% of use-cases you wait for the current iteration before pulling the next. But in those situations where you pull iterations concurrently, you likely need all the help you can get, and having iterations settle in order is just one less thing you have to worry about. Additionally, it was a requirement for repeaters, because one of the key design decisions was that repeaters should be indistinguishable from async generator objects, and having calls settle out of order would be a minor disparity between repeaters and async generators.
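
(To make the failure mode concrete, here is a hypothetical consumer, not from the thread, that pulls several iterations concurrently and reacts to each promise as it settles:)

// fire off several next() calls at once and react to each as it settles;
// if the source can settle a later `done` iteration before earlier value
// iterations, onDone() may run while values are still in flight
function consumeConcurrently(iterator, count, onValue, onDone) {
  for (let i = 0; i < count; i++) {
    iterator.next().then(iteration => {
      if (iteration.done) onDone(); // with out-of-order settlement, this can fire too early
      else onValue(iteration.value);
    });
  }
}
// with a repeater, the promises settle strictly in call order, so onDone()
// can only fire after every earlier value has been delivered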

Some additional questions about consumption of Repeaters:

(1) Is the for await loop "safe" for consuming async iterables, bottom line, given all of the above? I think the answer is either "yes" or "yes, as long as you don't care that cleanup might happen after the next call to next".

(2) If I am using Repeater.race because I think the promise might never settle, does using Repeater.race mean I don't have to worry about memory issues related to hanging promises? I think the answer is "no", there may still be memory issues, but I am stumped about what to do besides complaining to whoever passed me the subscriber function that THEY didn't implement a timeout....

Long-running promises are not supposed to cause memory leaks, but in practice they often do, because we only keep promises around to be awaited or queried. Every time you call then() on a long-running promise, you add another promise reaction to it, so if you call then() in an unbounded fashion, this becomes a memory leak, as the retained promise retains an increasing number of promise reactions.

This problem is actually exacerbated by calling Promise.race(), because when using Promise.race() with a long-running promise, the long-running promise retains the resolved value of each race. If you want the nitty-gritty details, you can see this comment I wrote in this Node.js issue.

Repeater.race is safe to use with long-running promises because it avoids calling Promise.race() entirely (#65). You should not experience memory leaks when using Repeater.race with long-running promises, and racing a long-running promise with an async iterator is one of the main use cases of this function. If you limit the number of times you react to a long-running promise, and never use Promise.race(), you’ll probably be in the clear.
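
(A hedged sketch of that use case; it assumes Repeater.race accepts an array of contenders that may be async iterables or promises, and that it finishes when any contender does:)

const { Repeater } = require("@repeaterjs/repeater");

// yields values from `events` until `cancelled` (a possibly long-running promise)
// settles; only a single reaction is attached to the promise, and Promise.race()
// is never called, so the leak described above is avoided
const takeUntil = (events, cancelled) => Repeater.race([events, cancelled]);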

In terms of the things settling in call order, is that only in cases when you are awaiting the push as in: await push(...); So that you do not push an additional event until the first settles? But your guide mentions other patterns, of course, right?

You do not need to await push calls with repeaters. All calls to next(), return(), and throw() on repeaters settle in order anyway, because each call is chained onto the previous call in an infinite promise chain.
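
(Conceptually, the chaining works something like the sketch below; this illustrates the idea, not the actual repeater internals:)

let previous = Promise.resolve();
function enqueue(operation) {
  // run the operation only after the previous call has settled, so results
  // can never settle out of call order
  const result = previous.then(operation, operation);
  previous = result.catch(() => {}); // keep the chain alive even if this call rejects
  return result;
}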

Feel free to open new issues or discussions if you need any help!

@brainkim (Member)

Closing this issue as the original question seems to have been resolved, and other stuff is tracked in other issues.

@yaacovCR

Thanks so much, really helpful!

@yaacovCR

Curious, is it possible to streamline a bit further?

More specifically, in the latest revision above, we are not returning the value of finalIteration, so do we really need to await it at all?

function map(iterable, fn) {
  const iter = iterable[Symbol.asyncIterator]();
  const returner = typeof iter.return === "function" ? iter.return : undefined;

  return new Repeater(async (push, stop) => {
    let finalIteration;
    stop.then(() => {
      finalIteration = returner ? returner() : true;
    });

    while (!finalIteration) {
      const iteration = await iter.next();
      if (iteration.done) {
        stop();
        return iteration.value;
      }
      await push(fn(iteration.value));
    }
    // what is the purpose of awaiting finalIteration prior to exiting the executor
    // perhaps to force the parent iterator to return prior to the child returning?
    // but would that matter?
    // await finalIteration;
  });
}

@brainkim (Member) commented Jan 21, 2021

@yaacovCR

what is the purpose of awaiting finalIteration prior to exiting the executor

If the map iterator is returned prematurely, we need to call return on the source iterator. We may not use that final iteration value, but this final return call can take an indeterminate amount of time, and it can also throw an error. Not awaiting it would allow for potential unhandled promise rejections because you’re floating the finalIteration promise.
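
(A minimal, self-contained illustration of the floating-promise hazard; failingReturn is a stand-in for an iter.return() call that rejects:)

async function demo() {
  const failingReturn = () => Promise.reject(new Error("return failed"));

  failingReturn(); // floating: nothing awaits or catches it, so the rejection goes unhandled

  try {
    await failingReturn(); // awaited: the rejection surfaces here and can be handled
  } catch (err) {
    console.error("handled:", err.message);
  }
}
demo();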

@yaacovCR

Makes sense, thanks.

yaacovCR added commits to ardatan/graphql-tools that referenced this issue (between May 11 and Jun 1, 2021): "Implementation adapted from: repeaterjs/repeater#48 (comment) so that all payloads will be delivered in the original order"
@yaacovCR commented Nov 5, 2021

I'm having problems using the transducer above when it is thrown; I think it's because the awaited push rejection is blocked by the subsequent next of the underlying iterator.

@brainkim (Member) commented Nov 5, 2021

@yaacovCR Can you open another issue? Sounds interesting!
