Auto merge of #95031 - compiler-errors:param-env-cache, r=Aaron1011
Do not use `ParamEnv::and` when building a cache key from a param-env and trait eval candidate
Do not use `ParamEnv::and` to cache a param-env with a selection/evaluation candidate.
This is because if the param-env is in `RevealAll` mode and the candidate looks global (i.e. it has erased regions, which can show up when we normalize a projection type under a binder<sup>1</sup>), then `ParamEnv::and` throws away the param-env's caller bounds when pairing the candidate with the param-env for the cache key. We therefore end up caching a candidate that was inferred from the param-env under an empty param-env instead, which can later cause a spurious cache hit when we really do have an empty param-env, and can mess with normalization like we see in the referenced issue during codegen.
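To make the failure mode concrete, here is a minimal standalone sketch. The types below (`ParamEnv`, `Candidate`, `Reveal`, `looks_global`) are simplified stand-ins for illustration only, not rustc's real internals: under `RevealAll`, an `and`-style pairing drops the caller bounds for a global-looking value, so a candidate proven under caller bounds and one proven under an empty param-env collapse into the same cache key, while keying on the pair directly (analogous to this PR's change) keeps them distinct.

```rust
use std::collections::HashMap;

#[derive(Clone, PartialEq, Eq, Hash, Debug)]
enum Reveal {
    UserFacing,
    All,
}

#[derive(Clone, PartialEq, Eq, Hash, Debug)]
struct ParamEnv {
    caller_bounds: Vec<String>,
    reveal: Reveal,
}

impl ParamEnv {
    fn without_caller_bounds(&self) -> ParamEnv {
        ParamEnv { caller_bounds: Vec::new(), reveal: self.reveal.clone() }
    }

    /// Mimics the problematic `and`-style pairing: in `RevealAll` mode, a
    /// global-looking value discards the param-env's caller bounds.
    fn and(&self, value: Candidate) -> (ParamEnv, Candidate) {
        if self.reveal == Reveal::All && value.looks_global {
            (self.without_caller_bounds(), value)
        } else {
            (self.clone(), value)
        }
    }
}

#[derive(Clone, PartialEq, Eq, Hash, Debug)]
struct Candidate {
    name: &'static str,
    looks_global: bool,
}

fn main() {
    // A param-env that carries real caller bounds, and a genuinely empty one.
    let with_bounds = ParamEnv {
        caller_bounds: vec!["T: Trait".to_string()],
        reveal: Reveal::All,
    };
    let empty = ParamEnv { caller_bounds: Vec::new(), reveal: Reveal::All };
    println!("caller bounds in the real env: {:?}", with_bounds.caller_bounds);

    // A candidate inferred *using* the caller bounds, but whose erased regions
    // make it look global.
    let candidate = Candidate { name: "impl-candidate", looks_global: true };

    // Buggy keying: `and` strips the caller bounds from the key, so the entry
    // cached under `with_bounds` ...
    let mut cache: HashMap<(ParamEnv, Candidate), &'static str> = HashMap::new();
    cache.insert(with_bounds.and(candidate.clone()), "ok-under-caller-bounds");
    // ... is found again from an *empty* param-env: a bogus cache hit.
    assert!(cache.contains_key(&empty.and(candidate.clone())));
    println!("bogus hit for {:?} under the empty param-env", candidate.name);

    // Fixed keying: use the (param-env, candidate) pair directly, so the two
    // param-envs stay distinct in the cache.
    let mut fixed: HashMap<(ParamEnv, Candidate), &'static str> = HashMap::new();
    fixed.insert((with_bounds, candidate.clone()), "ok-under-caller-bounds");
    assert!(!fixed.contains_key(&(empty, candidate)));
}
```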
Not sure how to trigger this with a more structured test, but changing `check-pass` to `build-pass` triggers the case that #94903 detected.
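For context, the two compiletest directives differ only in how far compilation goes: `check-pass` stops after type checking, while `build-pass` also runs codegen, which is where the `RevealAll` normalization path (and hence the bad cache hit) kicks in. A hypothetical test header, not the actual regression test in this PR, would change nothing but that directive:

```rust
// build-pass
// ^ was `check-pass`: check-pass only type-checks the test, build-pass also
//   builds it, exercising the RevealAll normalization done during codegen.

fn main() {}
```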
<sup>1.</sup> That is, we replace the late-bound region with a placeholder, which gets canonicalized and turned into an inference variable, which then gets erased during region freshening right before we cache the result. Sorry, it's quite a few steps.
Fixes #94903
r? `@Aaron1011` (or reassign as you see fit)