
Compiler panic: unexpected unsized tail: TyProjection(ProjectionTy { substs: Slice([chrono::Utc]), ... #48336

Closed
faern opened this issue Feb 18, 2018 · 24 comments

Labels: C-bug (Category: This is a bug.) · I-ICE (Issue: The compiler panicked, giving an Internal Compilation Error (ICE) ❄️) · P-high (High priority) · regression-from-stable-to-stable (Performance or correctness regression from one stable version to another.) · T-compiler (Relevant to the compiler team, which will review and decide on the PR/issue.)

faern (Contributor) commented Feb 18, 2018

Just after upgrading to stable Rust 1.24, and to the nightly released at approximately the same time, we started experiencing compiler crashes on Windows, Linux, and macOS, both on Travis/AppVeyor and locally.

error: internal compiler error: librustc_trans/context.rs:446: unexpected unsized tail: TyProjection(ProjectionTy { substs: Slice([chrono::Utc]), item_def_id: DefId(13/0:83 ~ chrono[e167]::offset[0]::TimeZone[0]::Offset[0]) })

I tried this code:

Sadly, I don't have a minimized example triggering the bug, because the error happens seemingly at random, most often on CI, and is thus hard to reproduce. But it happens on more than one feature branch of this workspace repository: https://github.com/mullvad/mullvadvpn-app. It has, for example, been observed on commit 33a2edadad1c3b5fe7ea667f6b7ad4dd97c69ec3.

EDIT: Just corrected the repo URL. It was wrong :P

Meta

rustc --version --verbose:

rustc 1.25.0-nightly (3ec5a99aa 2018-02-14)
binary: rustc
commit-hash: 3ec5a99aaa0084d97a9e845b34fdf03d1462c475
commit-date: 2018-02-14
host: x86_64-unknown-linux-gnu
release: 1.25.0-nightly
LLVM version: 6.0

But the same thing has happened on 1.24 stable as well.

Backtrace:
Below is the backtrace from my local run on Linux. A similar error can be seen for macOS on Travis here: https://travis-ci.org/mullvad/mullvadvpn-app/jobs/343143393#L311

error: internal compiler error: librustc_trans/context.rs:446: unexpected unsized tail: TyProjection(ProjectionTy { substs: Slice([chrono::Utc]), item_def_id: DefId(13/0:83 ~ chrono[e167]::offset[0]::TimeZone[0]::Offset[0]) })

thread 'rustc' panicked at 'Box<Any>', librustc_errors/lib.rs:535:9
note: Some details are omitted, run with `RUST_BACKTRACE=full` for a verbose backtrace.
stack backtrace:
   0: std::sys::unix::backtrace::tracing::imp::unwind_backtrace
             at libstd/sys/unix/backtrace/tracing/gcc_s.rs:49
   1: std::sys_common::backtrace::_print
             at libstd/sys_common/backtrace.rs:71
   2: std::panicking::default_hook::{{closure}}
             at libstd/sys_common/backtrace.rs:59
             at libstd/panicking.rs:380
   3: std::panicking::default_hook
             at libstd/panicking.rs:396
   4: std::panicking::rust_panic_with_hook
             at libstd/panicking.rs:576
   5: std::panicking::begin_panic
   6: rustc_errors::Handler::bug
   7: rustc::session::opt_span_bug_fmt::{{closure}}
   8: rustc::ty::context::tls::with_opt::{{closure}}
   9: <std::thread::local::LocalKey<T>>::try_with
  10: <std::thread::local::LocalKey<T>>::with
  11: rustc::ty::context::tls::with
  12: rustc::ty::context::tls::with_opt
  13: rustc::session::opt_span_bug_fmt
  14: rustc::session::bug_fmt
  15: rustc_trans::context::CodegenCx::type_has_metadata
  16: rustc_trans::mir::place::PlaceRef::project_field::{{closure}}
  17: rustc_trans::mir::place::PlaceRef::project_field
  18: rustc_trans::mir::place::<impl rustc_trans::mir::FunctionCx<'a, 'tcx>>::trans_place
  19: rustc_trans::mir::operand::<impl rustc_trans::mir::FunctionCx<'a, 'tcx>>::trans_consume
  20: rustc_trans::mir::operand::<impl rustc_trans::mir::FunctionCx<'a, 'tcx>>::trans_operand
  21: rustc_trans::mir::rvalue::<impl rustc_trans::mir::FunctionCx<'a, 'tcx>>::trans_rvalue
  22: rustc_trans::mir::trans_mir
  23: rustc_trans::base::trans_instance
  24: rustc_trans::base::compile_codegen_unit
  25: rustc::dep_graph::graph::DepGraph::with_task_impl
  26: rustc::ty::maps::<impl rustc::ty::maps::queries::compile_codegen_unit<'tcx>>::force
  27: rustc::ty::maps::<impl rustc::ty::maps::queries::compile_codegen_unit<'tcx>>::try_get
  28: rustc::ty::maps::TyCtxtAt::compile_codegen_unit
  29: rustc::ty::maps::<impl rustc::ty::context::TyCtxt<'a, 'tcx, 'lcx>>::compile_codegen_unit
  30: rustc_trans::base::trans_crate
  31: <rustc_trans::LlvmTransCrate as rustc_trans_utils::trans_crate::TransCrate>::trans_crate
  32: rustc_driver::driver::phase_4_translate_to_llvm
  33: rustc_driver::driver::compile_input::{{closure}}
  34: rustc::ty::context::TyCtxt::create_and_enter
  35: rustc_driver::driver::compile_input
  36: rustc_driver::run_compiler

note: the compiler unexpectedly panicked. this is a bug.

note: we would appreciate a bug report: https://github.com/rust-lang/rust/blob/master/CONTRIBUTING.md#bug-reports

note: rustc 1.25.0-nightly (3ec5a99aa 2018-02-14) running on x86_64-unknown-linux-gnu

error: Could not compile `mullvad-daemon`.
pietroalbini added the I-ICE, T-compiler, C-bug, and regression-from-stable-to-stable labels on Feb 20, 2018
nikomatsakis (Contributor) commented:

triage: P-high

First priority is clearly to see if we can reproduce the problem. I'll try to figure that out.

rust-highfive added the P-high label on Feb 22, 2018
nikomatsakis self-assigned this on Feb 22, 2018
faern (Contributor, author) commented Feb 22, 2018

I have tried a bit to reduce this to a smaller code example, but as soon as I touch any code and recompile, the error usually goes away.

nikomatsakis (Contributor) commented:

We are hoping it's not related to #47381 -- this came up in the @rust-lang/compiler meeting. I think it was meant as a joke, but now it seems eminently plausible. =)

pnkfelix (Member) commented Mar 1, 2018

Reassigning to self to attempt to reproduce (and reduce etc).

pnkfelix self-assigned this on Mar 1, 2018
pnkfelix (Member) commented Mar 2, 2018

Progress report (or lack thereof): I've made two reproduction attempts atop OS X, both based largely on the provided Travis log. The main difference I noted was that I had to omit --branch=fix-windows-warnings when cloning the repo, because that branch has been deleted. But I was still able to check out the specific commit noted in the Travis log, namely 08f6eb436f03ba0c68d2a1e460e6ca42048124d6, before I did the submodule update and build attempt.

Atop stable rustc 1.24.0 (4d90ac3 2018-02-12): no ICE observed from cargo build.

Atop nightly rustc 1.25.0-nightly (3ec5a99 2018-02-14): no ICE observed from cargo build.

I'll try with my Linux box next.
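
For reference, the attempt above boils down to roughly the following (a reconstruction from the description and the Travis log, not necessarily the exact commands; the commit hash is the one mentioned, everything else is the obvious default):

git clone https://github.com/mullvad/mullvadvpn-app # without --branch=fix-windows-warnings, since that branch is gone
cd mullvadvpn-app
git checkout 08f6eb436f03ba0c68d2a1e460e6ca42048124d6 # commit from the Travis log
git submodule update --init
cargo build # tried atop both 1.24.0 stable and the 2018-02-14 nightly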

pnkfelix (Member) commented Mar 2, 2018

On my Linux (Fedora) system, I tried two different base commits with one fixed version of rustc:

rustc 1.25.0-nightly (27a046e 2018-02-18), commit 08f6eb436f03ba0c68d2a1e460e6ca42048124d6: no ICE observed from cargo build

rustc 1.25.0-nightly (27a046e 2018-02-18), commit 33a2edadad1c3b5fe7ea667f6b7ad4dd97c69ec3: no ICE observed from cargo build

faern (Contributor, author) commented Mar 2, 2018

I should probably provide an update as well. When I reported this, our CI hit the bug in roughly 20-50% of builds. That number has gone down, and now AppVeyor and Travis CI almost never hit the bug.

However, the bug is not gone: I still get it now and then in my local environment, but not very often, so it is very hard to reproduce.

pnkfelix (Member) commented Mar 2, 2018

@faern when you say the error is happening "randomly", did you mean it seems to come and go as you make changes to the code or check out different commits, but reproduces consistently with a fixed code base and compiler version?

Or did you mean that it sometimes fails to reproduce even when you leave the code base and compiler version fixed?

(In other words, is this one of those bugs that appears to depend on the phase of the moon?)

pnkfelix (Member) commented Mar 2, 2018

@faern also, if you haven't already, you may want to see if you can reproduce the ICE when running rustc under rr, just to potentially work your way backwards to the cause.
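
(For what it's worth, the usual rr workflow looks roughly like the following. This is a sketch, not exact instructions: it assumes recording the whole cargo invocation is acceptable, since rr captures the child rustc processes too, and that breaking on std's rust_panic symbol is a good enough anchor point.)

rr record cargo build # record the failing build; the rustc children are captured in the same trace
rr ps # list the recorded processes to find the rustc invocation that ICE'd
rr replay -p rustc # replay deterministically, attaching gdb when a process named rustc is exec'd
# inside gdb: break on rust_panic (or rustc_errors::Handler::bug), then use
# reverse-continue / reverse-step to walk backwards from the panic towards the state that caused it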

pnkfelix (Member) commented Mar 2, 2018

@rust-lang/compiler Based on my own failure to reproduce and the update from @faern on the reduced frequency of the bug cropping up, I am tempted to either close the ticket outright (as "cannot reproduce") or at least reduce the priority to P-medium.

faern (Contributor, author) commented Mar 2, 2018

@pnkfelix It comes and goes on the same compiler version, yes. Once I get the bug, I get it no matter how many times I re-run cargo build. The only way to make it go away is to either make some substantial change to the code or run cargo clean. Since a cargo clean fixes the problem, the exact same compiler and code can sometimes build properly and sometimes hit the bug, which makes me think the invalid state is in some intermediate code representation in target/.
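
If the bad state really is somewhere under target/, one way to narrow it down further (just a sketch; these are the default cargo paths, nothing specific to this project) would be to clear only the incremental cache next time the ICE shows up, instead of doing a full cargo clean:

rm -rf target/debug/incremental # remove only the incremental compilation cache
cargo build # if this alone makes the ICE go away, incremental artifacts are the likely culprit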

faern (Contributor, author) commented Mar 2, 2018

@pnkfelix Next time I hit the error, I will re-compile under rr and see if I can get more information about it.

EDIT: How do you recommend I use rr (on Linux)? I'm not familiar with that tool.

nikomatsakis (Contributor) commented:

cc @michaelwoerister

Since a cargo clean fixes the problem, the exact same compiler and code can sometimes build properly and sometimes hit the bug, which makes me think the invalid state is in some intermediate code representation in target/.

Maybe a problem with incremental.
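
(A quick way to test that hypothesis, sketched here using the standard cargo switch rather than anything project-specific, is to rebuild with incremental compilation disabled in the environment that currently hits the ICE.)

CARGO_INCREMENTAL=0 cargo build # same code and compiler; if the ICE stops reproducing, incremental state is implicated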

michaelwoerister (Member) commented:

Maybe a problem with incremental.

Maybe. Or something in crate metadata.

jvff commented Mar 6, 2018

I've managed to reproduce this using some recent commits in the same mullvadvpn-app repository. The following steps seem to reproduce it (tested twice):

cargo clean
git checkout 982c42a # Previous master position
cargo build
git checkout 3e0d994 # Current master position
cargo build
git checkout 6090dc0 # Commit that causes the problem, in branch fix-openvpn-plugin-path
cargo build

faern (Contributor, author) commented Mar 6, 2018

I managed to reproduce it on Linux with rustc 1.24 stable and 1.26.0-nightly (259e4a6 2018-03-04). Awesome, @jvff!
@pnkfelix Now you might be able to track down the problem more easily. I'll see if I can get something with rr, but I bet you are more efficient at tracking down the problem.

pnkfelix (Member) commented Mar 8, 2018

Well, I tried to follow @jvff's instructions, but apparently the commit 6090dc0 and the branch fix-openvpn-plugin-path are no longer available from the mullvad repository. 😢

faern (Contributor, author) commented Mar 8, 2018

@pnkfelix A few days ago I pushed a branch named rustc-bug-commit pointing to that commit, to make sure GitHub does not remove it.
EDIT: Just tried a fresh clone from GitHub. I can reach all three commits using the hashes from @jvff.

jvff commented Mar 8, 2018

@pnkfelix Oops, I deleted the branch after I rebased and merged it. Sorry.

Thanks @faern for restoring access to the commit.

pnkfelix (Member) commented Mar 9, 2018

Okay, I have now reproduced the bug atop rustc 1.26.0-nightly (2789b067d 2018-03-06).

(Interestingly, I had to explicitly name the rustc-bug-commit branch when running git fetch; that is, I had to do git fetch origin rustc-bug-commit. I had thought from my past experience that git fetch $remote would grab all branches from $remote, but apparently I was wrong.)

Next step: make sure it reproduces under a local build of rustc, and if so, then I'll jump into rr and see if I can dissect.
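
(For anyone following along, the usual way to point cargo at a locally built rustc is to link the stage directory as a rustup toolchain; this is a generic sketch, not the exact commands used here, and the toolchain name is illustrative.)

rustup toolchain link local-rust build/x86_64-unknown-linux-gnu/stage2 # from a rust-lang/rust checkout, after ./x.py build
cd ../mullvadvpn-app && cargo +local-rust build # try the reproduction with the locally built compiler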

nikomatsakis (Contributor) commented:

ping @pnkfelix from @rust-lang/compiler -- were you able to make progress reproducing with a local build / rr?

pnkfelix (Member) commented:

Yes, I have reproduced it on a local build, but not with the latest master; the reproduction required backtracking to the same commit that was used for the nightly I mentioned above (i.e. 2789b06).

This might be a sign that this bug has been fixed. But given the transient nature of the bug, I am hesitant to draw that conclusion without at least bisecting to the commit where the bug disappears. I have not actually attempted that, however, because I wanted to investigate more directly atop the local build. (Which I started doing, but failed to learn much of use beyond what is already in the provided ICE message and stack trace...)

Anyway, there are a couple of different paths I could see going down to dissect this:

  1. Directly work one's way through the control-flow of the rustc execution that causes this bug, potentially using rr as a way to start from the end and work one's way back. This is what I started doing.
  2. Try to narrow the scope of the input source code down to something smaller. I didn't bother attempting this, given that @faern already said that they were not able to successfully minimize the bug on their end. Still, it's possible that we might make some small headway via tools like -Z everybody_loops.
  3. Bisect the rustc history itself (from 2789b06 to master), and see if there is a clear history of the bug reproducing for a period of time; if so, then maybe the PR that fixed it will give us some clue as to what the underlying problem is/was.

Of the options above, 3 is the one that is perhaps easiest for anyone to try their hand at (assuming they are familiar with git bisect and how to use it well with rustc's commit history), once they understand how to actually reproduce the problem itself given the instructions above; a rough sketch of one way to drive such a bisection follows after the caveat below.

  • If one wants to try this route, then I would caution that one should definitely not blindly trust git bisect here. We do not know if this problem is one where the bug will transiently come and go based on unrelated factors in the rustc binary's object code. So it would be a good idea to sample a couple different points in the history and see if there is a clear trend of "bug; ... ; bug; ...; bug; ...; working; ... working; ... working; ...", which would give us hope that we may learn something from a full bisection. If you see "bug; ... bug; ... working; ... bug; ... working; ... bug; ... working; ...", then that's a hint that full bisection could likely be a waste of time.
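
A rough sketch of how such a bisection might be driven (commit hashes are the ones discussed above; the toolchain name and build invocation are illustrative, and each good/bad judgement here is made by hand). Note the inversion: since we are hunting for the commit that made the bug disappear, "good" below means "the ICE still reproduces" and "bad" means "it no longer does".

cd rust # local clone of rust-lang/rust
git bisect start
git bisect good 2789b067d # known to reproduce the ICE
git bisect bad master # assumed to no longer reproduce
# at each bisection step: build the compiler, link it as a toolchain, and try the reproduction
./x.py build --stage 1
rustup toolchain link bisect build/x86_64-unknown-linux-gnu/stage1
(cd ../mullvadvpn-app && cargo +bisect build) # then mark this step with git bisect good or git bisect bad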

pnkfelix (Member) commented Mar 21, 2018

I decided as a background task to go ahead and try bisecting.

Huge caveat: I did not take my own advice, in that I have not yet double-checked that the commit series looks like "bug; bug; ...; bug; working; working; ... working"

  • Basically, git bisect makes it far too easy to not attempt to double-check the above claim.
  • But of course now that I have identified a potential candidate, one can perform such verification after the fact.

Anyway, here's what the bisection looked like (with some edits to avoid having markdown create tracking links on presumably irrelevant PRs). (Keep in mind that "bad" in this context means "with this build of the compiler, the bug did not arise".)

So, was this bug "fixed" by #48710 ...? Or is that just an incidental change that hides the symptom and papers over the actual bug...? Not sure yet.

pnkfelix (Member) commented:

After discussion at the compiler team meeting, closing under the hypothesis that #48710 "fixed" this by removing the inferred_obligations, which were sometimes incorrectly dropped rather than proven (and which @nikomatsakis says could indeed lead to ICEs like this).
