[DO NOT LAND] Regress the token-stream-stress benchmark. #67248

Closed
39 changes: 9 additions & 30 deletions src/libsyntax/tokenstream.rs
@@ -238,39 +238,18 @@ impl TokenStream {
             0 => TokenStream::default(),
             1 => streams.pop().unwrap(),
             _ => {
-                // We are going to extend the first stream in `streams` with
-                // the elements from the subsequent streams. This requires
-                // using `make_mut()` on the first stream, and in practice this
-                // doesn't cause cloning 99.9% of the time.
-                //
-                // One very common use case is when `streams` has two elements,
-                // where the first stream has any number of elements within
-                // (often 1, but sometimes many more) and the second stream has
-                // a single element within.
-
-                // Determine how much the first stream will be extended.
-                // Needed to avoid quadratic blow up from on-the-fly
-                // reallocations (#57735).
-                let num_appends = streams.iter()
-                    .skip(1)
-                    .map(|ts| ts.len())
+                // rust-lang/rust#57735: pre-allocate vector to avoid
+                // quadratic blow-up due to on-the-fly reallocations.
+                let tree_count = streams.iter()
+                    .map(|ts| ts.0.len())
                     .sum();
 
-                // Get the first stream. If it's `None`, create an empty
-                // stream.
-                let mut iter = streams.drain(..);
-                let mut first_stream_lrc = iter.next().unwrap().0;
-
-                // Append the elements to the first stream, after reserving
-                // space for them.
-                let first_vec_mut = Lrc::make_mut(&mut first_stream_lrc);
-                first_vec_mut.reserve(num_appends);
-                for stream in iter {
-                    first_vec_mut.extend(stream.0.iter().cloned());
-                }
+                let mut vec = Vec::with_capacity(tree_count);
 
-                // Create the final `TokenStream`.
-                TokenStream(first_stream_lrc)
+                for stream in streams {
+                    vec.extend(stream.0.iter().cloned());
+                }
+                TokenStream::new(vec)
             }
         }
     }
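For readers comparing the two approaches outside of rustc, here is a minimal, self-contained sketch of the before/after concatenation strategies. It is an illustration only: `Lrc` is approximated with `std::sync::Arc`, token trees are stand-in `u32`s rather than libsyntax's real tree type, and the helper names `from_streams_in_place` and `from_streams_fresh_vec` are hypothetical. The old code reuses the first stream's allocation via `make_mut` when it is not shared; the new code always clones every tree into a freshly allocated (but pre-sized) `Vec`, which is what regresses the token-stream-stress benchmark.

```rust
// Sketch only: `Arc` stands in for rustc's `Lrc`, and `u32` stands in for
// the real token-tree type.
use std::sync::Arc;

#[derive(Clone)]
struct TokenStream(Arc<Vec<u32>>);

impl TokenStream {
    fn new(trees: Vec<u32>) -> TokenStream {
        TokenStream(Arc::new(trees))
    }

    // Old strategy (removed by this PR): extend the first stream in place.
    // `Arc::make_mut` clones the underlying Vec only if it is shared, so in
    // the common single-owner case the first stream's buffer is reused.
    fn from_streams_in_place(mut streams: Vec<TokenStream>) -> TokenStream {
        let num_appends: usize = streams.iter().skip(1).map(|ts| ts.0.len()).sum();
        let mut iter = streams.drain(..);
        let mut first = iter.next().unwrap().0;
        let first_vec = Arc::make_mut(&mut first);
        first_vec.reserve(num_appends);
        for stream in iter {
            first_vec.extend(stream.0.iter().cloned());
        }
        TokenStream(first)
    }

    // New strategy (added by this PR): always build a fresh, pre-sized Vec
    // and clone every tree into it, including the first stream's.
    fn from_streams_fresh_vec(streams: Vec<TokenStream>) -> TokenStream {
        let tree_count: usize = streams.iter().map(|ts| ts.0.len()).sum();
        let mut vec = Vec::with_capacity(tree_count);
        for stream in streams {
            vec.extend(stream.0.iter().cloned());
        }
        TokenStream::new(vec)
    }
}

fn main() {
    let streams = vec![TokenStream::new(vec![1, 2, 3]), TokenStream::new(vec![4])];
    assert_eq!(*TokenStream::from_streams_in_place(streams.clone()).0, vec![1, 2, 3, 4]);
    assert_eq!(*TokenStream::from_streams_fresh_vec(streams).0, vec![1, 2, 3, 4]);
}
```

Both variants pre-size the destination to avoid the quadratic reallocation behaviour tracked in #57735; the difference the diff introduces is only whether the first stream's existing buffer gets reused.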