
librespot crashes - main thread panic #103

Closed
sashahilton00 opened this issue Jan 29, 2018 · 11 comments

Comments

@sashahilton00
Member

Issue by pwitka2
Monday Jan 22, 2018 at 20:10 GMT
Originally opened as plietar/librespot#273


librespot crashes with the following error message:

INFO:librespot::player: Track "Sweet Pea" loaded
thread 'main' panicked at 'Box', core/src/session.rs:83:42
note: Run with RUST_BACKTRACE=1 for a backtrace.

It happened after a long idle (paused) time.

I'm running Raspbian Wheezy with
raspotify 0.9.1~librespot.20171005T125424Z.8971d3a

@sashahilton00
Member Author

Comment by kingosticks
Monday Jan 22, 2018 at 20:52 GMT


I would suggest you run with RUST_BACKTRACE=1 and provide the backtrace.

@sashahilton00
Member Author

Comment by pwitka2
Wednesday Jan 24, 2018 at 19:16 GMT


Here is a backtrace, but the app crashes in a different place:
INFO:librespot::player: Track "Magic Wind" loaded
thread 'main' panicked at 'connection closed', /checkout/src/libcore/option.rs:819:4
stack backtrace:
0:
1:
2:
3:
4:
5:
6:
7:
8:
9:
10:
11:
12:
13:
14:
15:
16: __libc_start_main

@sashahilton00
Member Author

Comment by kingosticks
Wednesday Jan 24, 2018 at 19:23 GMT


Raspotify seems to have decided to strip the binary. That's not helpful.

@ComlOnline
Contributor

Following a conversation with the dev of raspotify, he said he was going to release a new version based on this fork. I'd wait for that and try again.

@ComlOnline
Contributor

ComlOnline commented Jan 29, 2018

Just checked he's already done it: https://github.com/dtcooper/raspotify/releases/tag/0.10
Try to cause the error again and let us know.

@ComlOnline ComlOnline added the bug label Jan 29, 2018
@sashahilton00
Member Author

This issue is still present; I had it this morning. It's like he said: it crops up after no music has been playing for a while (and potentially after the internet has disconnected/reconnected). I'll run with a backtrace later and see if I can get a trace.

@sashahilton00
Member Author

sashahilton00 commented Jan 29, 2018

@kingosticks @ComlOnline here is a backtrace for this issue. I triggered it by playing a song through librespot, then disconnecting the internet and waiting for the buffered music to finish playing. What appears to happen is that at the end of the song, the track load is called; since the connection is down, the session is dropped, and this is not handled. Full trace below:

DEBUG:librespot::spirc: kMessageTypeNotify "LRS" dc956dabe1c2e64e61d0aa7b96d4093c70896df9 21 1517251454413
DEBUG:librespot_core::session: Session[0] strong=3 weak=3
DEBUG:librespot::player: command=Load(SpotifyId(u128 { high: 15501048872228962569, low: 12963269621848591226 }), true, 0)
DEBUG:librespot_core::session: drop Dispatch
thread 'main' panicked at 'Box<Any>', core/src/session.rs:83:42
stack backtrace:
   0: std::sys::imp::backtrace::tracing::imp::unwind_backtrace
             at src/libstd/sys/unix/backtrace/tracing/gcc_s.rs:49
   1: std::sys_common::backtrace::_print
             at src/libstd/sys_common/backtrace.rs:68
   2: std::panicking::default_hook::{{closure}}
             at src/libstd/sys_common/backtrace.rs:57
             at src/libstd/panicking.rs:381
   3: std::panicking::default_hook
             at src/libstd/panicking.rs:397
   4: std::panicking::begin_panic
             at src/libstd/panicking.rs:577
   5: std::panicking::begin_panic
             at /Users/travis/build/rust-lang/rust/src/libstd/panicking.rs:538
   6: librespot_core::session::Session::connect::{{closure}}::{{closure}}
             at core/src/session.rs:83
   7: <core::result::Result<T, E>>::map_err
             at /Users/travis/build/rust-lang/rust/src/libcore/result.rs:500
   8: <futures::future::map_err::MapErr<A, F> as futures::future::Future>::poll
             at /Users/sashahilton/.cargo/registry/src/github.com-1ecc6299db9ec823/futures-0.1.14/src/future/map_err.rs:34
   9: <alloc::boxed::Box<F> as futures::future::Future>::poll
             at /Users/sashahilton/.cargo/registry/src/github.com-1ecc6299db9ec823/futures-0.1.14/src/future/mod.rs:107
  10: <futures::task_impl::Spawn<F>>::poll_future_notify::{{closure}}
             at /Users/sashahilton/.cargo/registry/src/github.com-1ecc6299db9ec823/futures-0.1.14/src/task_impl/mod.rs:291
  11: <futures::task_impl::Spawn<T>>::enter::{{closure}}
             at /Users/sashahilton/.cargo/registry/src/github.com-1ecc6299db9ec823/futures-0.1.14/src/task_impl/mod.rs:352
  12: futures::task_impl::std::set
             at /Users/sashahilton/.cargo/registry/src/github.com-1ecc6299db9ec823/futures-0.1.14/src/task_impl/std/mod.rs:90
  13: <futures::task_impl::Spawn<T>>::enter
             at /Users/sashahilton/.cargo/registry/src/github.com-1ecc6299db9ec823/futures-0.1.14/src/task_impl/mod.rs:352
  14: <futures::task_impl::Spawn<F>>::poll_future_notify
             at /Users/sashahilton/.cargo/registry/src/github.com-1ecc6299db9ec823/futures-0.1.14/src/task_impl/mod.rs:291
  15: tokio_core::reactor::Core::dispatch_task::{{closure}}
             at /Users/sashahilton/.cargo/registry/src/github.com-1ecc6299db9ec823/tokio-core-0.1.9/src/reactor/mod.rs:356
  16: <scoped_tls::ScopedKey<T>>::set
             at /Users/sashahilton/.cargo/registry/src/github.com-1ecc6299db9ec823/scoped-tls-0.1.0/src/lib.rs:135
  17: tokio_core::reactor::Core::dispatch_task
             at /Users/sashahilton/.cargo/registry/src/github.com-1ecc6299db9ec823/tokio-core-0.1.9/src/reactor/mod.rs:355
  18: tokio_core::reactor::Core::dispatch
             at /Users/sashahilton/.cargo/registry/src/github.com-1ecc6299db9ec823/tokio-core-0.1.9/src/reactor/mod.rs:316
  19: futures::task_impl::Notify::clone_id
             at /Users/sashahilton/.cargo/registry/src/github.com-1ecc6299db9ec823/tokio-core-0.1.9/src/reactor/mod.rs:304
  20: tokio_core::reactor::Core::run
             at /Users/sashahilton/.cargo/registry/src/github.com-1ecc6299db9ec823/tokio-core-0.1.9/src/reactor/mod.rs:241
  21: librespot::main
             at src/main.rs:377
  22: panic_unwind::dwarf::eh::read_encoded_pointer
             at src/libpanic_unwind/lib.rs:101
  23: rust_panic
             at src/libstd/panicking.rs:459
             at src/libstd/panic.rs:365
             at src/libstd/rt.rs:58
  24: librespot::main
DEBUG:librespot::spirc: drop Spirc[0]
DEBUG:librespot::player: Shutting down player thread ...
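The failure mode described above (a track load against a dead session unwinding the main thread via the `panic!` in the `map_err` closure at core/src/session.rs:83) could instead surface as a recoverable error. A minimal sketch of that idea, using hypothetical `Session`/`SessionError` types rather than librespot's actual API:

```rust
// Hypothetical sketch: make a dropped connection a value the caller can
// handle, instead of panicking inside the session's error-mapping closure.
// `Session`, `SessionError`, and `load_track` are illustrative names only.

#[derive(Debug, PartialEq)]
enum SessionError {
    ConnectionClosed,
}

struct Session {
    connected: bool,
}

impl Session {
    // Rather than `.map_err(|e| panic!(...))`, propagate the failure so the
    // caller (e.g. the player) can decide to retry, reconnect, or shut down.
    fn load_track(&self, track: &str) -> Result<String, SessionError> {
        if self.connected {
            Ok(format!("loaded {}", track))
        } else {
            Err(SessionError::ConnectionClosed)
        }
    }
}

fn main() {
    let dropped = Session { connected: false };
    match dropped.load_track("Sweet Pea") {
        Ok(t) => println!("{}", t),
        // The caller handles the error; the main thread keeps running.
        Err(SessionError::ConnectionClosed) => {
            println!("session lost; a reconnect could be attempted here")
        }
    }
}
```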

@michalfita

I can add that this happens when the connection to the Internet is dropped (router up, but DSL dies).

Would it at least be possible to ensure that the process:

pi@raspberrypi:~$ ps aux | grep respot
raspoti+   872  3.5  1.3  33716 12184 ?        Ssl  lut05  75:03 /usr/bin/librespot --name Malina ∏ 3 --backend alsa --bitrate 320 --disable-audio-cache
pi        2049  0.0  0.2   5992  1884 pts/0    S+   22:44   0:00 grep respot

really dies on the crash, so that systemd could restart it?

If it's fixed, we'll need to wait for the next release of raspotify.
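Since a main-thread panic does terminate the process with a non-zero exit code, systemd can be told to restart it. A sketch of a drop-in override, assuming raspotify installs a unit named raspotify.service (create it with `sudo systemctl edit raspotify`):

```ini
# Hypothetical drop-in override for raspotify.service; the unit name is
# an assumption about how raspotify packages its service.
[Service]
Restart=on-failure
RestartSec=10
```

This only papers over the crash, of course; it doesn't fix the unhandled dropped session.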

@awiouy
Collaborator

awiouy commented Feb 8, 2018

From the tokio-core documentation for Core::run:

This method will not catch panics from polling the future f. If the future panics then it's the responsibility of the caller to catch that panic and handle it as appropriate.

Is that done in session.rs? If so, where? If not, where should it be done and what should it do?

@sashahilton00
Member Author

I'm not sure where this is implemented exactly, as I haven't looked at the code that handles it, but my understanding is that the session logic is due for a rewrite to enable handling of things like reconnection. This has been mentioned several times in the past, most recently by @plietar on Gitter on Feb 8:

Is there a tracking issue for reconnecting automatically when the connection is broken? I had a few ideas for implementation from a while ago I'd like to write down.

Hence, I've created #134 to track suggestions/improvements that can be made to the existing session logic, or in a rewrite of it.
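One shape such a reconnection rewrite could take is a retry loop with exponential backoff. The following is purely an illustrative sketch of the backoff schedule, not librespot code and not a proposal from #134:

```rust
use std::time::Duration;

// Illustrative exponential backoff for a hypothetical reconnect loop:
// 1s, 2s, 4s, ... doubling each attempt, capped at 64s.
fn backoff(attempt: u32) -> Duration {
    let secs = 1u64 << attempt.min(6);
    Duration::from_secs(secs)
}

fn main() {
    for attempt in 0..8 {
        println!("attempt {} -> wait {:?}", attempt, backoff(attempt));
    }
}
```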

@plietar
Contributor

plietar commented Feb 14, 2018

Since this is just a reconnection issue, I'm going to close this in favour of #134.
