Commit 2d5843d
[DO NOT LAND] Regress the token-stream-stress benchmark.
I have a suspicion that there is a bug in rustc-perf or rust-timer that causes CI to measure the wrong revisions. See #66405 and #67079 for more details.

This commit deliberately causes a massive regression in the `token-stream-stress` benchmark: on my machine, the instruction count goes from 313M to 6084M, an 1843.4% regression. I want to see if a CI run replicates that.
Parent commit: 7dbfb0a

1 file changed: +9 −30 lines

src/libsyntax/tokenstream.rs

```diff
@@ -238,39 +238,18 @@ impl TokenStream {
             0 => TokenStream::default(),
             1 => streams.pop().unwrap(),
             _ => {
-                // We are going to extend the first stream in `streams` with
-                // the elements from the subsequent streams. This requires
-                // using `make_mut()` on the first stream, and in practice this
-                // doesn't cause cloning 99.9% of the time.
-                //
-                // One very common use case is when `streams` has two elements,
-                // where the first stream has any number of elements within
-                // (often 1, but sometimes many more) and the second stream has
-                // a single element within.
-
-                // Determine how much the first stream will be extended.
-                // Needed to avoid quadratic blow up from on-the-fly
-                // reallocations (#57735).
-                let num_appends = streams.iter()
-                    .skip(1)
-                    .map(|ts| ts.len())
+                // rust-lang/rust#57735: pre-allocate vector to avoid
+                // quadratic blow-up due to on-the-fly reallocations.
+                let tree_count = streams.iter()
+                    .map(|ts| ts.0.len())
                     .sum();
 
-                // Get the first stream. If it's `None`, create an empty
-                // stream.
-                let mut iter = streams.drain(..);
-                let mut first_stream_lrc = iter.next().unwrap().0;
-
-                // Append the elements to the first stream, after reserving
-                // space for them.
-                let first_vec_mut = Lrc::make_mut(&mut first_stream_lrc);
-                first_vec_mut.reserve(num_appends);
-                for stream in iter {
-                    first_vec_mut.extend(stream.0.iter().cloned());
-                }
+                let mut vec = Vec::with_capacity(tree_count);
 
-                // Create the final `TokenStream`.
-                TokenStream(first_stream_lrc)
+                for stream in streams {
+                    vec.extend(stream.0.iter().cloned());
+                }
+                TokenStream::new(vec)
             }
         }
     }
```
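The performance difference hinges on `make_mut`: the removed code mutated the first stream's buffer in place (cloning only if the `Lrc` was shared, which the comment says is rare), while the new code unconditionally copies every element, including the first stream's, into a fresh `Vec`. Below is a minimal, self-contained sketch of the two strategies using std's `Rc` and `Vec<u32>` in place of rustc's `Lrc` and token trees; the function names `concat_in_place` and `concat_by_cloning` are hypothetical, not from the rustc source.

```rust
use std::rc::Rc;

// Old strategy: extend the first stream's buffer in place.
// `Rc::make_mut` clones the inner `Vec` only when the `Rc` is
// shared; with a unique owner it mutates in place.
fn concat_in_place(mut streams: Vec<Rc<Vec<u32>>>) -> Rc<Vec<u32>> {
    // Reserve space up front to avoid quadratic reallocation (#57735).
    let extra: usize = streams.iter().skip(1).map(|s| s.len()).sum();
    let mut iter = streams.drain(..);
    let mut first = iter.next().unwrap();
    let first_mut = Rc::make_mut(&mut first); // clones only if shared
    first_mut.reserve(extra);
    for s in iter {
        first_mut.extend(s.iter().cloned());
    }
    first
}

// New (regressed) strategy: copy everything, including the first
// stream's elements, into a freshly allocated `Vec`.
fn concat_by_cloning(streams: Vec<Rc<Vec<u32>>>) -> Rc<Vec<u32>> {
    let total: usize = streams.iter().map(|s| s.len()).sum();
    let mut v = Vec::with_capacity(total);
    for s in streams {
        v.extend(s.iter().cloned()); // always copies the first stream too
    }
    Rc::new(v)
}

fn main() {
    let a = Rc::new(vec![1, 2, 3]);
    let b = Rc::new(vec![4]);
    assert_eq!(*concat_in_place(vec![a.clone(), b.clone()]), vec![1, 2, 3, 4]);
    assert_eq!(*concat_by_cloning(vec![a, b]), vec![1, 2, 3, 4]);
}
```

Both return the same result; they differ only in how much data moves. In the common case the commit message describes (a large first stream plus a one-element second stream), the in-place version touches one element while the cloning version copies the entire first stream, which is consistent with the large instruction-count regression reported above.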
