Remove special-casing for argument patterns in MIR typeck (attempt to fix perf regression of #133858) #135273

Merged
1 commit merged on Jan 10, 2025
13 changes: 0 additions & 13 deletions compiler/rustc_borrowck/src/type_check/mod.rs
@@ -892,19 +892,6 @@ impl<'a, 'tcx> TypeChecker<'a, 'tcx> {
                 Some(l) if !body.local_decls[l].is_user_variable() => {
                     ConstraintCategory::Boring
                 }
-                Some(_)
-                    if let Some(body_id) = tcx
-                        .hir_node_by_def_id(body.source.def_id().expect_local())
-                        .body_id()
-                        && let params = tcx.hir().body(body_id).params
-                        && params
-                            .iter()
-                            .any(|param| param.span.contains(stmt.source_info.span)) =>
-                {
-                    // Assignments generated from lowering argument patterns shouldn't be called
-                    // "assignments" in diagnostics and aren't interesting to blame for errors.
Comment on lines -904 to -905

Member:
Maybe differentiating the assignments generated from lowering argument patterns at the point where they are lowered would still allow this diagnostics improvement at a lower cost -- or, the opposite: use the same method, but only on the error path, where we know it's not as perf-sensitive.

Contributor (author):
Unfortunately, using the same method only on the error path doesn't seem clean to me, at least in the general case. The constraint category is used by best_blame_constraint, which runs during borrowck on the happy path when outlives constraints from closures need to be propagated to whatever body they're defined in. We could maybe track in best_blame_constraint whether we're emitting an error or checking a closure, but we'd risk not blaming a propagated outlives constraint from a closure if it happened to be from an argument pattern. There's already a bit of weirdness with not blaming ideal constraints when they're propagated from closures, so I'd rather not make that more confusing.

Differentiating assignments during lowering would work, I think. I'd looked into putting the information in the assignment's span (as a new kind of desugaring) to avoid needing to store anything extra in the MIR itself. I opted not to go for that though since it looked like keeping track of which assignments came from argument patterns would require an unpleasant amount of bookkeeping during either MIR or THIR construction. It wouldn't be hard, but the added complexity didn't seem worth it to me for such a niche diagnostic tweak [1]. I don't think I checked how much bookkeeping it'd be to do elsewhere, though.

Another option would be to give up (for now) on preventing assignments from argument patterns from being blamed, and instead just pick a different name for assignments (like "binding", maybe [2]) when they don't come from assignment-like syntax. This should be doable entirely on the error path, and would also mean we could avoid calling match expressions "assignments" too if we'd like.

We could also try an entirely different heuristic to hopefully better address the issue this hack was papering over (see the footnote below), but it'd be a substantial enough change that it should have its own PR, and I'm honestly not confident the fix I have in mind would work. There might be some other less-fragile, less-hacky way of writing best_blame_constraint altogether, but if so, I'm not sure what it would look like. borrowck as written makes it difficult to tell exactly why region errors occur, hence the need for best_blame_constraint to divine some meaning.

I'm not sure which of these options would be best to pursue, but I think just picking a different word for assignments from argument patterns (and maybe also match expressions) would be easiest and least intrusive.

Footnotes

[1] I think the argument pattern is only being blamed here instead of the more relevant assignment in the function body because of how we handle invariance in diagnostics. The main trick best_blame_constraint uses to guess what to blame for region errors breaks down when the regions involved are related contravariantly, and we currently don't have a reliable way of knowing when that's the case. Hence all the heuristics, like this one, that try to blame something hopefully-reasonable even in those cases.

[2] I'm not totally sure what word would be best. The nice thing about "assignment" is that it's very clear about the relationship being described.
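Footnote [1]'s variance point can be illustrated in surface Rust (my sketch, not from the PR): `&'a T` is covariant in `'a`, so a longer-lived reference freely coerces to a shorter-lived one, while `&mut &'a T` is invariant in `'a`, which produces the equality-like constraints that best_blame_constraint struggles to explain.

```rust
// Covariance: a `&'static u8` coerces to a `&'short u8`, so this compiles.
fn shorten<'short>(long: &'static u8) -> &'short u8 {
    long
}

// The analogous shrink behind a `&mut` is rejected: `&mut &'a u8` is
// invariant in `'a`, so this body would NOT compile if uncommented:
// fn shorten_mut<'short>(long: &mut &'static u8) -> &mut &'short u8 { long }

fn main() {
    static X: u8 = 7;
    let r: &u8 = shorten(&X);
    assert_eq!(*r, 7);
    println!("{}", r);
}
```

This is the same invariance that makes `*v = x` in the test below an error without a named lifetime.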

-                    ConstraintCategory::Boring
-                }
                 _ => ConstraintCategory::Assignment,
             };
             debug!(
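For context on what the deleted arm was detecting (my illustration, not rustc code): a pattern in argument position is lowered as if the pattern were destructured from a fresh parameter at the top of the body, and those synthesized assignments are what the arm classified as `ConstraintCategory::Boring`.

```rust
// A function whose argument is a pattern...
fn sum((a, b): (u8, u8)) -> u8 {
    a + b
}

// ...behaves roughly like this desugared form. The implicit `let` is the
// kind of "assignment" the deleted code avoided blaming in diagnostics.
fn sum_desugared(arg: (u8, u8)) -> u8 {
    let (a, b) = arg; // assignment generated from the argument pattern
    a + b
}

fn main() {
    assert_eq!(sum((2, 3)), sum_desugared((2, 3)));
    println!("{}", sum((2, 3)));
}
```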
@@ -1,6 +1,6 @@
 fn foo(&mut (ref mut v, w): &mut (&u8, &u8), x: &u8) {
-    *v = x;
     //~^ ERROR lifetime may not live long enough
+    *v = x;
 }

 fn main() { }
@@ -1,13 +1,15 @@
 error: lifetime may not live long enough
-  --> $DIR/ex3-both-anon-regions-2.rs:2:5
+  --> $DIR/ex3-both-anon-regions-2.rs:1:14
    |
 LL | fn foo(&mut (ref mut v, w): &mut (&u8, &u8), x: &u8) {
-   |                                   -             - let's call the lifetime of this reference `'1`
-   |                                   |
-   |                                   let's call the lifetime of this reference `'2`
-LL |     *v = x;
-   |     ^^^^^^ assignment requires that `'1` must outlive `'2`
+   |              ^^^^^^^^^           -             - let's call the lifetime of this reference `'1`
+   |              |                   |
+   |              |                   let's call the lifetime of this reference `'2`
+   |              assignment requires that `'1` must outlive `'2`
    |
+   = note: requirement occurs because of a mutable reference to `&u8`
+   = note: mutable references are invariant over their type parameter
+   = help: see <https://doc.rust-lang.org/nomicon/subtyping.html> for more information about variance
 help: consider introducing a named lifetime parameter
    |
 LL | fn foo<'a>(&mut (ref mut v, w): &mut (&'a u8, &u8), x: &'a u8) {
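Applying the compiler's suggested named-lifetime fix to the test case yields a program that compiles (my sketch; the PR itself only changes where the diagnostic points, not what compiles):

```rust
// The help suggestion applied: `'a` ties the tuple's first reference to `x`.
fn foo<'a>(&mut (ref mut v, w): &mut (&'a u8, &u8), x: &'a u8) {
    *v = x; // now accepted: both sides have lifetime `'a`
    let _ = w;
}

fn demo() -> u8 {
    let (a, b, x) = (1u8, 2u8, 3u8);
    let mut pair: (&u8, &u8) = (&a, &b);
    foo(&mut pair, &x);
    *pair.0 // `pair.0` now points at `x`
}

fn main() {
    assert_eq!(demo(), 3);
    println!("{}", demo());
}
```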