
proposal: x/exp/xiter: new package with iterator adapters #61898

Open
rsc opened this issue Aug 9, 2023 · 207 comments

@rsc
Contributor

rsc commented Aug 9, 2023

We propose to add a new package golang.org/x/exp/xiter that defines adapters on iterators. Perhaps these would one day be moved to the iter package or perhaps not. There are concerns about how these would affect idiomatic Go code. It seems worth defining them in x/exp to help that discussion along, and then we can decide whether they move anywhere else when we have more experience with them.

The package is called xiter to avoid a collision with the standard library iter (see proposal #61897). An alternative would be to have xiter define wrappers and type aliases for all the functions and types in the standard iter package, but the type aliases would depend on #46477, which is not yet implemented.

This is one of a collection of proposals updating the standard library for the new 'range over function' feature (#61405). It would only be accepted if that proposal is accepted. See #61897 for a list of related proposals.

Edit, 2024-05-15: Added some missing 2s in function names, and also changed Reduce to take the function first, instead of between sum and seq.

Edit, 2024-07-17: Updated code to match the final Go 1.23 language change. Corrected various typos.


/*
Package xiter implements basic adapters for composing iterator sequences:

  - [Concat] and [Concat2] concatenate sequences.
  - [Equal], [Equal2], [EqualFunc], and [EqualFunc2] check whether two sequences contain equal values.
  - [Filter] and [Filter2] filter a sequence according to a function f.
  - [Limit] and [Limit2] truncate a sequence after n items.
  - [Map] and [Map2] apply a function f to a sequence.
  - [Merge], [Merge2], [MergeFunc], and [MergeFunc2] merge two ordered sequences.
  - [Reduce] and [Reduce2] combine the values in a sequence.
  - [Zip] and [Zip2] iterate over two sequences in parallel.
*/
package xiter

import (
	"cmp"
	"iter"
)

// Concat returns an iterator over the concatenation of the sequences.
func Concat[V any](seqs ...iter.Seq[V]) iter.Seq[V] {
	return func(yield func(V) bool) {
		for _, seq := range seqs {
			for e := range seq {
				if !yield(e) {
					return
				}
			}
		}
	}
}

// Concat2 returns an iterator over the concatenation of the sequences.
func Concat2[K, V any](seqs ...iter.Seq2[K, V]) iter.Seq2[K, V] {
	return func(yield func(K, V) bool) {
		for _, seq := range seqs {
			for k, v := range seq {
				if !yield(k, v) {
					return
				}
			}
		}
	}
}

// Equal reports whether the two sequences are equal.
func Equal[V comparable](x, y iter.Seq[V]) bool {
	for z := range Zip(x, y) {
		if z.Ok1 != z.Ok2 || z.V1 != z.V2 {
			return false
		}
	}
	return true
}

// Equal2 reports whether the two sequences are equal.
func Equal2[K, V comparable](x, y iter.Seq2[K, V]) bool {
	for z := range Zip2(x, y) {
		if z.Ok1 != z.Ok2 || z.K1 != z.K2 || z.V1 != z.V2 {
			return false
		}
	}
	return true
}

// EqualFunc reports whether the two sequences are equal according to the function f.
func EqualFunc[V1, V2 any](x iter.Seq[V1], y iter.Seq[V2], f func(V1, V2) bool) bool {
	for z := range Zip(x, y) {
		if z.Ok1 != z.Ok2 || !f(z.V1, z.V2) {
			return false
		}
	}
	return true
}

// EqualFunc2 reports whether the two sequences are equal according to the function f.
func EqualFunc2[K1, V1, K2, V2 any](x iter.Seq2[K1, V1], y iter.Seq2[K2, V2], f func(K1, V1, K2, V2) bool) bool {
	for z := range Zip2(x, y) {
		if z.Ok1 != z.Ok2 || !f(z.K1, z.V1, z.K2, z.V2) {
			return false
		}
	}
	return true
}

// Filter returns an iterator over seq that only includes
// the values v for which f(v) is true.
func Filter[V any](f func(V) bool, seq iter.Seq[V]) iter.Seq[V] {
	return func(yield func(V) bool) {
		for v := range seq {
			if f(v) && !yield(v) {
				return
			}
		}
	}
}

// Filter2 returns an iterator over seq that only includes
// the pairs k, v for which f(k, v) is true.
func Filter2[K, V any](f func(K, V) bool, seq iter.Seq2[K, V]) iter.Seq2[K, V] {
	return func(yield func(K, V) bool) {
		for k, v := range seq {
			if f(k, v) && !yield(k, v) {
				return
			}
		}
	}
}

// Limit returns an iterator over seq that stops after n values.
func Limit[V any](seq iter.Seq[V], n int) iter.Seq[V] {
	return func(yield func(V) bool) {
		if n <= 0 {
			return
		}
		for v := range seq {
			if !yield(v) {
				return
			}
			if n--; n <= 0 {
				break
			}
		}
	}
}

// Limit2 returns an iterator over seq that stops after n key-value pairs.
func Limit2[K, V any](seq iter.Seq2[K, V], n int) iter.Seq2[K, V] {
	return func(yield func(K, V) bool) {
		if n <= 0 {
			return
		}
		for k, v := range seq {
			if !yield(k, v) {
				return
			}
			if n--; n <= 0 {
				break
			}
		}
	}
}

// Map returns an iterator over f applied to seq.
func Map[In, Out any](f func(In) Out, seq iter.Seq[In]) iter.Seq[Out] {
	return func(yield func(Out) bool) {
		for in := range seq {
			if !yield(f(in)) {
				return
			}
		}
	}
}

// Map2 returns an iterator over f applied to seq.
func Map2[KIn, VIn, KOut, VOut any](f func(KIn, VIn) (KOut, VOut), seq iter.Seq2[KIn, VIn]) iter.Seq2[KOut, VOut] {
	return func(yield func(KOut, VOut) bool) {
		for k, v := range seq {
			if !yield(f(k, v)) {
				return
			}
		}
	}
}

// Merge merges two sequences of ordered values.
// Values appear in the output once for each time they appear in x
// and once for each time they appear in y.
// If the two input sequences are not ordered,
// the output sequence will not be ordered,
// but it will still contain every value from x and y exactly once.
//
// Merge is equivalent to calling MergeFunc with cmp.Compare[V]
// as the ordering function.
func Merge[V cmp.Ordered](x, y iter.Seq[V]) iter.Seq[V] {
	return MergeFunc(x, y, cmp.Compare[V])
}

// MergeFunc merges two sequences of values ordered by the function f.
// Values appear in the output once for each time they appear in x
// and once for each time they appear in y.
// When equal values appear in both sequences,
// the output contains the values from x before the values from y.
// If the two input sequences are not ordered by f,
// the output sequence will not be ordered by f,
// but it will still contain every value from x and y exactly once.
func MergeFunc[V any](x, y iter.Seq[V], f func(V, V) int) iter.Seq[V] {
	return func(yield func(V) bool) {
		next, stop := iter.Pull(y)
		defer stop()
		v2, ok2 := next()
		for v1 := range x {
			for ok2 && f(v1, v2) > 0 {
				if !yield(v2) {
					return
				}
				v2, ok2 = next()
			}
			if !yield(v1) {
				return
			}
		}
		for ok2 {
			if !yield(v2) {
				return
			}
			v2, ok2 = next()
		}
	}
}

// Merge2 merges two sequences of key-value pairs ordered by their keys.
// Pairs appear in the output once for each time they appear in x
// and once for each time they appear in y.
// If the two input sequences are not ordered by their keys,
// the output sequence will not be ordered by its keys,
// but it will still contain every pair from x and y exactly once.
//
// Merge2 is equivalent to calling MergeFunc2 with cmp.Compare[K]
// as the ordering function.
func Merge2[K cmp.Ordered, V any](x, y iter.Seq2[K, V]) iter.Seq2[K, V] {
	return MergeFunc2(x, y, cmp.Compare[K])
}

// MergeFunc2 merges two sequences of key-value pairs ordered by the function f.
// Pairs appear in the output once for each time they appear in x
// and once for each time they appear in y.
// When pairs with equal keys appear in both sequences,
// the output contains the pairs from x before the pairs from y.
// If the two input sequences are not ordered by f,
// the output sequence will not be ordered by f,
// but it will still contain every pair from x and y exactly once.
func MergeFunc2[K, V any](x, y iter.Seq2[K, V], f func(K, K) int) iter.Seq2[K, V] {
	return func(yield func(K, V) bool) {
		next, stop := iter.Pull2(y)
		defer stop()
		k2, v2, ok2 := next()
		for k1, v1 := range x {
			for ok2 && f(k1, k2) > 0 {
				if !yield(k2, v2) {
					return
				}
				k2, v2, ok2 = next()
			}
			if !yield(k1, v1) {
				return
			}
		}
		for ok2 {
			if !yield(k2, v2) {
				return
			}
			k2, v2, ok2 = next()
		}
	}
}

// Reduce combines the values in seq using f.
// For each value v in seq, it updates sum = f(sum, v)
// and then returns the final sum.
// For example, if iterating over seq yields v1, v2, v3,
// Reduce returns f(f(f(sum, v1), v2), v3).
func Reduce[Sum, V any](f func(Sum, V) Sum, sum Sum, seq iter.Seq[V]) Sum {
	for v := range seq {
		sum = f(sum, v)
	}
	return sum
}

// Reduce2 combines the values in seq using f.
// For each pair k, v in seq, it updates sum = f(sum, k, v)
// and then returns the final sum.
// For example, if iterating over seq yields (k1, v1), (k2, v2), (k3, v3),
// Reduce2 returns f(f(f(sum, k1, v1), k2, v2), k3, v3).
func Reduce2[Sum, K, V any](f func(Sum, K, V) Sum, sum Sum, seq iter.Seq2[K, V]) Sum {
	for k, v := range seq {
		sum = f(sum, k, v)
	}
	return sum
}

// A Zipped is a pair of zipped values, one of which may be missing,
// drawn from two different sequences.
type Zipped[V1, V2 any] struct {
	V1  V1
	Ok1 bool // whether V1 is present (if not, it will be zero)
	V2  V2
	Ok2 bool // whether V2 is present (if not, it will be zero)
}

// Zip returns an iterator that iterates x and y in parallel,
// yielding Zipped values of successive elements of x and y.
// If one sequence ends before the other, the iteration continues
// with Zipped values in which either Ok1 or Ok2 is false,
// depending on which sequence ended first.
//
// Zip is a useful building block for adapters that process
// pairs of sequences. For example, Equal can be defined as:
//
//	func Equal[V comparable](x, y iter.Seq[V]) bool {
//		for z := range Zip(x, y) {
//			if z.Ok1 != z.Ok2 || z.V1 != z.V2 {
//				return false
//			}
//		}
//		return true
//	}
func Zip[V1, V2 any](x iter.Seq[V1], y iter.Seq[V2]) iter.Seq[Zipped[V1, V2]] {
	return func(yield func(z Zipped[V1, V2]) bool) {
		next, stop := iter.Pull(y)
		defer stop()
		v2, ok2 := next()
		for v1 := range x {
			if !yield(Zipped[V1, V2]{v1, true, v2, ok2}) {
				return
			}
			v2, ok2 = next()
		}
		var zv1 V1
		for ok2 {
			if !yield(Zipped[V1, V2]{zv1, false, v2, ok2}) {
				return
			}
			v2, ok2 = next()
		}
	}
}

// A Zipped2 is a pair of zipped key-value pairs,
// one of which may be missing, drawn from two different sequences.
type Zipped2[K1, V1, K2, V2 any] struct {
	K1  K1
	V1  V1
	Ok1 bool // whether K1, V1 are present (if not, they will be zero)
	K2  K2
	V2  V2
	Ok2 bool // whether K2, V2 are present (if not, they will be zero)
}

// Zip2 returns an iterator that iterates x and y in parallel,
// yielding Zipped2 values of successive elements of x and y.
// If one sequence ends before the other, the iteration continues
// with Zipped2 values in which either Ok1 or Ok2 is false,
// depending on which sequence ended first.
//
// Zip2 is a useful building block for adapters that process
// pairs of sequences. For example, Equal2 can be defined as:
//
//	func Equal2[K, V comparable](x, y iter.Seq2[K, V]) bool {
//		for z := range Zip2(x, y) {
//			if z.Ok1 != z.Ok2 || z.K1 != z.K2 || z.V1 != z.V2 {
//				return false
//			}
//		}
//		return true
//	}
func Zip2[K1, V1, K2, V2 any](x iter.Seq2[K1, V1], y iter.Seq2[K2, V2]) iter.Seq[Zipped2[K1, V1, K2, V2]] {
	return func(yield func(z Zipped2[K1, V1, K2, V2]) bool) {
		next, stop := iter.Pull2(y)
		defer stop()
		k2, v2, ok2 := next()
		for k1, v1 := range x {
			if !yield(Zipped2[K1, V1, K2, V2]{k1, v1, true, k2, v2, ok2}) {
				return
			}
			k2, v2, ok2 = next()
		}
		var zk1 K1
		var zv1 V1
		for ok2 {
			if !yield(Zipped2[K1, V1, K2, V2]{zk1, zv1, false, k2, v2, ok2}) {
				return
			}
			k2, v2, ok2 = next()
		}
	}
}
@rsc rsc added the Proposal label Aug 9, 2023
@gopherbot gopherbot added this to the Proposal milestone Aug 9, 2023
@gophun

gophun commented Aug 9, 2023

The duplication of each function is the first thing that catches the eye. Are there thoughts on why this is acceptable?

@gophun

gophun commented Aug 9, 2023

What about an adapter that converts an iter.Seq[V] to an iter.Seq2[int, V] and an adapter that converts an iter.Seq2[K, V] to an iter.Seq[V]?

@zephyrtronium
Contributor

Some typos: EqualFunc2, Map2, Merge2, and MergeFunc2 lack the 2 suffixes on their actual names. They're all correct in the corresponding documentation.

@earthboundkid
Contributor

May I humbly suggest that the name "iterutils" is less susceptible to, uh, unfortunate mispronunciation.

@earthboundkid
Contributor

For Reduce, the callback should go last: func Reduce[Sum, V any](sum Sum, seq Seq[V], f func(Sum, V) Sum) Sum.

@DeedleFake

DeedleFake commented Aug 9, 2023

For Reduce, the callback should go last: func Reduce[Sum, V any](sum Sum, seq Seq[V], f func(Sum, V) Sum) Sum.

I'd actually prefer func Reduce[Sum, V any](seq Seq[V], sum Sum, f func(Sum, V) Sum) Sum.

Edit: I just realized that if Reduce() is being used to build an array, putting sum first puts everything in the same order as Append() and other functions that put the destination first. I'm not sure if that's worth it or not.

@rsc
Contributor Author

rsc commented Aug 9, 2023

This proposal has been added to the active column of the proposals project
and will now be reviewed at the weekly proposal review meetings.
— rsc for the proposal review group

@DeedleFake

The more I think about it, the more that I think that API design for this should wait until after a decision is made on #49085. Multiple other languages have proven over and over that a left-to-right chained syntax is vastly superior ergonomically to simple top-level functions for iterators. For example, compare

nonNegative := xiter.Filter(
  func(v int) bool { return v >= 0 },
  xiter.Map(
    parseLine,
    bufio.Lines(r),
  ),
)

vs.

nonNegative := bufio.Lines(r).
  Map(parseLine).
  Filter(func(v int) bool { return v >= 0 })

Go's a little weird because of the need to put the . on the previous line, but other than that one oddity, which I could get used to, the second is better in every way. It reads in the order that actions actually happen, it's less repetitive, etc. The only real way to emulate it currently is something like

lines := bufio.Lines(r)
intlines := xiter.Map(parseLine, lines)
nonNegative := xiter.Filter(func(v int) bool { return v >= 0 }, intlines)

That works, but it clutters up the local namespace and it's significantly harder to edit. For example, if you decide you need to add a new step in the chain, you have to make sure that all of the variables for each iterator match up in the previous and succeeding calls.

@ianlancetaylor
Contributor

What type does bufio.Lines return to make that work in Go? What methods does that type support? What is the type of nonNegative? I mean these as honest questions. Can we write this kind of code in Go today, or would we need new language features?

@hherman1

You would probably have to wrap the base iterator like:

stream.New(bufio.Lines).
    Filter(…).
    …

@DeedleFake

DeedleFake commented Aug 10, 2023

@ianlancetaylor

Sorry. I should have stuck a comment in. I was just coming up with some hypothetical function that would give an iter.Seq[string]. In this case, the idea was that it would internally use a bufio.Scanner to yield lines from an io.Reader or something. My original code had an anonymous func(string) int instead of the vague parseLine but I removed it because it was clogging up the example with irrelevant code and I didn't clarify when I did.

@hherman1

Not necessarily. The transformative and sink functions on iterators could just be defined as methods on iter.Seq.

@hherman1

hherman1 commented Aug 10, 2023

But iter.Seq is an interface type no? Are you saying it should be a struct type?

I was wrong, it’s not an interface.

@benhoyt
Contributor

benhoyt commented Aug 10, 2023

Why do some functions take the f func as the last parameter, but Filter and Map take it as the first, and Reduce in the middle? Most other functions in the stdlib take funcs as the last parameter, such as sort.Slice, slices.*Func, ServeMux.HandleFunc, and so on. This makes code that uses them with inline function literals more readable:

names := xiter.Map(func(p Person) string {
	return p.Name
}, people) // "people" gets lost

// vs

names := xiter.Map(people, func(p Person) string {
	return p.Name
})

@Merovius
Contributor

Merovius commented Aug 10, 2023

@DeedleFake There won't be a "decision" on #49085 anytime soon. There are good reasons not to do it yet, but we also don't want to say it never happens. The issue exists to reflect that state. What it comes down to is, would you rather have no iterators (for the foreseeable future) or ones which can't be "chained"?

@DeedleFake

DeedleFake commented Aug 10, 2023

What it comes down to is, would you rather have no iterators (for the foreseeable future) or ones which can't be "chained"?

No iterators, definitely. I've done fine without them for over a decade. I can wait a bit longer. If a bad implementation goes in, I'll never get a good version. Plus, I can just write my own implementation of whatever iterator functions I need as long as range-over-func exists while I wait.

@gophun

gophun commented Aug 10, 2023

Neither chaining nor functional programming has ever been a decisive or recommended technique in Go. Instead, iteration—specifically, procedural 'for' loops—has always been a core technique since the language's inception. The iterator proposals aim to enhance this core approach. While I don't know what the overall plans are, if you're hoping for Go to follow the path of Java Streams or C# LINQ, you might be in for disappointment.

@Merovius
Contributor

I can wait a bit longer. If a bad implementation goes in, I'll never get a good version.

I think "a bit" is misleading. We are talking years - if at all. And I don't believe the second part of that sentence is true either, we could always release a v2 of the relevant packages, if we ever manage to do #49085 in a decade or so.

@DeedleFake

DeedleFake commented Aug 10, 2023

While I don't know what the overall plans are, if you're hoping for Go to follow the path of Java Streams or C# LINQ, you might be in for disappointment.

Is that not the intention of these proposals? To build a standardized iterator system that works similarly to those? Why else is there a proposal here for Map(), Filter(), and Reduce(), among others? I have no problem with functions like slices.Backwards() and other source function proposals. My only problem is the transformative and sink functions.

I think "a bit" is misleading. We are talking years - if at all. And I don't believe the second part of that sentence is true either, we could always release a v2 of the relevant packages, if we ever manage to do #49085 in a decade or so.

Edit: The way this proposal is phrased does actually imply that they may be heavily reevaluated enough in x/exp that they may not go into the standard library at all, so maybe my point is moot after all. I still think that this is a valid issue with the API design to bring up, but maybe it's a bit off-topic for this particular proposal and should wait until after they're in x/exp and it can be more easily demonstrated how awkward they are to use. I don't like the idea that existing code will be broken when some variant of them does potentially go into the standard library, but it's less of a problem than I was worried about. Never mind. Please ignore my rambling below.

That issue has only been open for 2 years. I think assuming that it'll take a decade to solve is a bit unfair. Yes, a v2 is an option, especially if #61716 is accepted, but that was created out of necessity to deal with problems in an existing package, while this would essentially be purposefully putting problems into a new package. It's not like I'm saying that iterators are unacceptable to me in this state, just that features have been delayed or cut before because of possible changes coming later and that I think that it's prudent to discuss the possibility here. That just happened in the last few weeks in the maps package because of the possibility of the acceptance of #61405. I think the same should be done with the transformative and sink functions for now, or at the very least those functions should be planned to stay in x/exp until some way to clean up the API is decided on, that's all.

One of my favorite things about Go is how slow and methodical it (usually) is in introducing new features. I think that the fact that it took over a decade to add generics is a good thing, and I really wanted generics. One of the purposes of that approach is to try avoid having to fix it later. Adding those functions in the proposed manner will almost definitely necessitate that later fix, and I very much would like to avoid that if at all possible.

@gophun

gophun commented Aug 10, 2023

Is that not the intention of these proposals? To build a standardized iterator system that works similarly to those?

Java Streams and .NET LINQ build on a standardized iterator system, but they are more than that. Both languages had a generic iterator system before. Iterators are useful without chaining or functional programming.

Why else is there a proposal here for Map(), Filter(), and Reduce(), among others?

That would be this very proposal, and it comes with a caveat: "... or perhaps not. There are concerns about how these would affect idiomatic Go code. "

This means that not everyone who has read these proposals in advance believes that this part is a good idea.

@jba
Contributor

jba commented Aug 10, 2023

While I don't know what the overall plans are, if you're hoping for Go to follow the path of Java Streams or C# LINQ, you might be in for disappointment.

Is that not the intention of these proposals? To build a standardized iterator system that works similarly to those? Why else is there a proposal here for Map(), Filter(), and Reduce(), among others?

Maybe chaining leads to too much of a good thing. It becomes more tempting to write long, hard-to-read chains of functions. You're less likely to do that if you have to nest calls.

As an analogy, Go has if. Isn't the intention of if to allow conditional execution? Why then shouldn't Go have the ternary operator ?:? Because it often leads to hard-to-read code.

@rsc
Contributor Author

rsc commented Aug 10, 2023

Re #49085, generic methods either require (A) dynamic code generation or (B) terrible speed or (C) hiding those methods from dynamic interface checks or (D) not doing them at all. We have chosen option (D). The issue remains open like so many suggestions people have made, but I don't see a realistic path forward where we choose A, B, or C, nor do I see a fifth option. So it makes sense to assume generic methods are not going to happen and do our work accordingly.

@Merovius
Contributor

Merovius commented Aug 10, 2023

@DeedleFake The issue is not a lack of understanding of what missing parameterized methods means. It's just that, as @rsc said, wanting them doesn't make them feasible. The issue only being 2 years old is deceptive. The underlying problem is actually as old as Go and one of the main reasons we didn't have generics for most of that time. Which you should consider, when you say

I think that the fact that it took over a decade to add generics is a good thing, and I really wanted generics.

We got generics by committing to keep implementation strategies open, thus avoiding the generics dilemma. Not having parametric methods is a pretty direct consequence of that decision.

@DeedleFake

Well, I tried. If that's the decision then that's the decision. I'm disappointed, but I guess I'll just be satisfied with what I do like about the current proposal, even if it has, in my opinion, some fairly major problems. Sorry for dragging this a bit off-topic there.

@thediveo

Hope that it's not noise: I wondered whether naming the parameter sum might imply to less experienced devs that reduce only does summation, so I looked at JavaScript's array reduce: it uses accumulator. I don't know if that is much better; I just wanted to point it out. If anything, let's have a good laugh.

@jimmyfrasche
Member

Those nonstandard Zip definitions look like they would occasionally be useful, but I think I'd want the ordinary zip/zipLongest definitions most of the time. Those can be recovered from the proposed ones with some postprocessing, but I'd hate to always have to do that.

These should be considered along with Limit:

LimitFunc - stop iterating after a predicate matches (often called TakeWhile in other languages)

Skip, SkipFunc - drop the first n items (or until the predicate matches) before yielding (opposite of Limit/LimitFunc, often called drop/dropWhile)

@jba
Contributor

jba commented Aug 10, 2023

Those nonstandard Zip definitions look like they would occasionally be useful but I think I'd want the ordinary zip/zipLongest definitions most of the time.

Can you explain the difference? Is it just that zip typically stops at the end of the shorter sequence? That is definitely less useful as a building block, and easy to write given these functions. What are some examples where stopping at the shortest is better?

@jimmyfrasche
Member

zip stops after the shorter sequence. zipLongest pads out the missing values of the shorter sequence with a specified value.

The provided ones are more general and can be used to build those, but I can't really think of any time I've used zip where I needed to know that. I've always either known the lengths were equal by construction, so it didn't matter, or couldn't do anything other than drop the excess, so it didn't matter. Maybe that's peculiar to me and the situations in which I reach for zip, but they've been defined like that in every language I can think of that I've used, which has to be some kind of indicator that I'm not alone in this.

I'm not arguing for them to be replaced with the less general more common versions: I want those versions here too so I can use them directly without having to write a shim to the standard definition.

@jub0bs

jub0bs commented Sep 17, 2024

@jimmyfrasche Maybe I'm missing something, but what you're describing is a single-use iterator, which doesn't seem specific to the functions I suggested. Can you link to a Playground that illustrates your point?

@bobg

Head unrecoverably discards the rest of the iterator

I don't perceive this as problematic. For a single-use iterator, restarting the loop would yield the second element and so on; for a multiple-use iterator, the loop would yield the first one again. That wouldn't be too bad if the caller only cared about the head and if my implementation didn't leak the pull-based iterator.

Tail requires the caller to use the returned iterator, otherwise the defer stop() never gets called and resources can leak from the input iterator.

Very good point, which I had missed. I'll have to think about this a bit more, it seems.

@Merovius
Contributor

Merovius commented Sep 17, 2024

My take (playground):

func Cut[E any](s iter.Seq[E]) (head E, tail iter.Seq[E], ok bool) {
	for v := range s {
		head, ok = v, true
		break
	}
	tail = func(yield func(E) bool) {
		if !ok {
			return
		}
		first := true
		for v := range s {
			if first {
				first = false
				continue
			}
			if !yield(v) {
				return
			}
		}
	}
	return head, tail, ok
}

Though, for the record, I'm against including something like this, for reasons already mentioned by others.

@jub0bs

jub0bs commented Sep 17, 2024

@DeedleFake @Merovius After thinking about this a bit more, I have to agree: I can't think of a way for functions like Head, Tail, Uncons to work with single-use iterators. In that case, leaving them out is preferable. The desired "logic" (treating the first element, if any, in a special way) is easy enough to implement in the loop of a push-based iterator.

@jimmyfrasche
Member

Every case I've had for Head has been to simplify a pattern I kept coming across in higher-order iterators:

first, once := true, false
for v := range seq {
	if first {
		first = false
		// prime the pump
	} else {
		once = true
		// actual loop code
	}
}
if !first && !once {
	// special case for a one-value seq
}

In terms of this thread, though, I only mentioned Head as another thing that could be implemented with Push.

@jub0bs

jub0bs commented Sep 19, 2024

The more I ponder the design of a library that would complement iter, the more I'm wary of sink functions that may never terminate, such as the first four functions suggested in #61898 (comment). After all, just like a caller cannot know a priori what values a context.Context contains, the caller cannot know a priori whether a given iter.Seq[E] or iter.Seq2[K,V] is finite.

For example, the program below never terminates:

package main

import (
	"fmt"
	"iter"
)

func main() {
	fmt.Println(Count(someFunction()))
}

func Count[E any](seq iter.Seq[E]) int {
	var n int
	for range seq {
		n++
	}
	return n
}

func someFunction() iter.Seq[string] {
	return Repeat("foo")
}

func Repeat[E any](e E) iter.Seq[E] {
	return func(yield func(E) bool) {
		for yield(e) {
			// deliberately empty body
		}
	}
}

(Playground)

IMO, sink functions that may not terminate are simply too easy to misuse. As such, they're better left out. Or alternatives guaranteed to terminate, like the one suggested in #61898 (comment), should be provided instead.

@Merovius
Contributor

@jub0bs I'll note that you don't know a priori whether an io.Reader ever returns io.EOF, yet we are fine with io.Copy etc. existing.

@jub0bs

jub0bs commented Sep 19, 2024

@Merovius Good point, but I suspect that infinite iterators will be more common than io.Readers that never return io.EOF. For example, I don't think my Repeat function or the following function are particularly farfetched:

func Iterate[E any](e E, f func(E) E) iter.Seq[E] {
	return func(yield func(E) bool) {
		for yield(e) {
			e = f(e)
		}
	}
}

@jub0bs

jub0bs commented Sep 19, 2024

Or, if such sinks are included in x/exp/xiter, their documentation should clearly state that the caller is responsible for passing them only finite iterators.

@adonovan
Member

I'm not sure I see the danger with Repeat or Iterate; both seem like elegant solutions to some problems. As @Merovius said, just as with io.Readers, some iterators are finite, some infinite. Just because the Sum or Count of an infinite sequence is bottom doesn't mean there is a problem with Sum or Count, or with your infinite sequence.

@jub0bs

jub0bs commented Sep 19, 2024

@adonovan I do like all those functions. I just think that people not coming from an FP background should be warned in the documentation that some sinks expect a finite iterator.

@DeedleFake

Another function proposal:

func Drain[T any](seq iter.Seq[T]) (last T, ok bool) {
  for v := range seq {
    last = v
    ok = true
  }
  return last, ok
}

Last() could also work as a name.

This may have been proposed further up, and if so sorry for repeating it. I scanned through but didn't see anything obvious.

jub0bs added a commit to jub0bs/iterutil that referenced this issue Sep 21, 2024
Those functions would leak a pull-based iterator in some cases;
and even if the bug in question were fixed,
those functions couldn't work with single-use iterators;
see golang/go#61898 (comment) and
follow-up comments.
@eihigh

eihigh commented Oct 5, 2024

In my opinion, a Push converter that pairs with iter.Pull would be useful. Push would allow values to be pushed gradually to functions that take an iterator as an argument.

func Push[V any](recv func(iter.Seq[V])) (push func(V) bool) {
	var in V

	coro := func(yieldCoro func(struct{}) bool) {
		seq := func(yieldSeq func(V) bool) {
			for yieldSeq(in) {
				yieldCoro(struct{}{})
			}
		}
		recv(seq)
	}

	next, stop := iter.Pull(coro)
	return func(v V) bool {
		in = v
		_, more := next()
		if !more {
			stop()
			return false
		}
		return true
	}
}
func main() {
	sum := func(src iter.Seq[int]) {
		sum := 0
		for v := range src {
			sum += v
			fmt.Printf("- sum: %d\n", sum)
		}
	}
	pushSum := Push(sum)

	for {
		var n int
		fmt.Scanln(&n)
		if !pushSum(n) {
			break
		}
	}
}
$ go run .
1
- sum: 1
2
- sum: 3
3
- sum: 6

This is necessary for handling endless data streams, such as real-time metrics or event sequences, as iterators. Additionally, while iter.Pull allowed for integrating multiple iterators like Zip and Merge, conversely, distributing from a single iterator to multiple iterators is difficult without Push.

func main() {
	fizz := func(src iter.Seq[int]) {
		for v := range src {
			if v%3 == 0 {
				fmt.Println("- fizz")
			}
		}
	}
	pushFizz := Push(fizz)

	buzz := func(src iter.Seq[int]) {
		for v := range src {
			if v%5 == 0 {
				fmt.Println("- buzz")
			}
		}
	}
	pushBuzz := Push(buzz)

	sum := func(src iter.Seq[int]) {
		sum := 0
		for v := range src {
			sum += v
			fmt.Printf("- sum: %d\n", sum)
			if !pushFizz(sum) || !pushBuzz(sum) {
				return
			}
		}
	}
	pushSum := Push(sum)

	for {
		var n int
		fmt.Scanln(&n)
		if !pushSum(n) {
			break
		}
	}
}
$ go run .
1
- sum: 1
2
- sum: 3
- fizz
3
- sum: 6
- fizz
4
- sum: 10
- buzz
5
- sum: 15
- fizz
- buzz
6
- sum: 21
- fizz

@DmitriyMV
Contributor

Another function proposal:

func Drain[T any](seq iter.Seq[T]) (last T, ok bool) {
  for v := range seq {
    last = v
    ok = true
  }
  return last, ok
}

Last() could also work as a name.

This may have been proposed further up, and if so sorry for repeating it. I scanned through but didn't see anything obvious.

This will not work.

@adonovan
Member

adonovan commented Oct 24, 2024

This will not work.

That depends how you define it: it correctly returns the last element, which means it must have iterated over the whole sequence. If the Seq is a one-shot, then it will have been drained. "Last" seems like a clearer name for this operation because that's what it guarantees; "Drain" only makes sense for one-shot sequences (which, one hopes, are far from the norm).

In any case I'm not convinced that this is something we should put in the library. Users might easily think that Last on a Seq whose representation supports random access, such as a slice, would be O(1), but in fact it is not, and indeed cannot be, efficient, because there is no way to ask a Seq if it supports a narrower random-access interface.

@dfreese

dfreese commented Oct 29, 2024

I was experimenting with this API and one thing I ran into was that there is no way to turn an iter.Seq[T] into an iter.Seq2[T] and vice versa. There are Zip and Zip2, but those don't compose well with other iterator functions, since an iterator is often described once as the result of other functions.

The use case I ran into was building up a map[string]myStruct, where the key was an identifier pulled from the struct that would be used for lookup.

func Split[In, KOut, VOut any](f func(In) (KOut, VOut), seq iter.Seq[In]) iter.Seq2[KOut, VOut] {
	return func(yield func(KOut, VOut) bool) {
		for in := range seq {
			if !yield(f(in)) {
				return
			}
		}
	}
}

func Combine[KIn, VIn, Out any](f func(KIn, VIn) Out, seq iter.Seq2[KIn, VIn]) iter.Seq[Out] {
	return func(yield func(Out) bool) {
		for k, v := range seq {
			if !yield(f(k, v)) {
				return
			}
		}
	}
}

That would allow things such as:

type data struct {
	id string
}

func foo(datas []*data) map[string]*data {
	return maps.Collect(xiter.Split(func(d *data) (string, *data) {
		return d.id, d
	}, slices.Values(datas)))
}

Edit: I see now that this has been discussed at length, sorry for the noise.

@DeedleFake

DeedleFake commented Oct 30, 2024

More and more I see examples like that that make me want both some variant of #21498 and a pipe operator:

return slices.Values(datas) |>
  xiter.Split(func { d -> d.id, d }) |>
  maps.Collect()

I feel like most of the iterator adapters will have very limited usefulness without especially #21498.

@DmitriyMV
Contributor

@DeedleFake I doubt this is ever going to happen, though, since inferring the number of arguments would then depend on what comes before the call. It hurts readability for those unfamiliar with languages like Elm, Haskell, and so on.

@jba
Contributor

jba commented Oct 31, 2024

#70140 erroneously duplicates Merge but adds Intersect, Union and Subtract, which are only meaningful on sorted sequences. It's unclear whether they should be in a separate package.

DmitriyMV added a commit to DmitriyMV/gen that referenced this issue Oct 31, 2024
This is breaking change PR:
- Reorder `func` in xiter package, so that it's always first. That helps with several transformations in one place. See [this](golang/go#61898 (comment)).
- Add `Single` and `Single2` iterators.
- Add `Find` and `Find2` iterators.
- Rename `Fold` to `Reduce`.
- Other small changes.

Signed-off-by: Dmitriy Matrenichev <dmitry.matrenichev@siderolabs.com>