
Signal handler in Magisk hide causes deadlock #3976

Closed
RikkaW opened this issue Mar 4, 2021 · 4 comments

RikkaW commented Mar 4, 2021

Logs

magisk.log
logcat.txt (filtered with magiskd)

Update 2021-03-05

The signal handler causes the deadlock: during magisk_log, localtime_r takes an internal lock, then a signal arrives, the handler runs magisk_log and localtime_r again, and the thread deadlocks on the lock it already holds 😶.

It's quite possible that many weird problems can be explained by this. For example, some people have reported mounting randomly getting blocked.

libmagisk64.zip (magiskd binary, version 22001)

maps.txt

bt_all.txt (lldb bt all)

Thread #2 is Magisk hide.

  thread #2, name = 'magiskd', stop reason = signal SIGSTOP
    frame #0: 0x0000007e9705422c libc.so`syscall + 28
    frame #1: 0x0000007e97058c18 libc.so`__futex_wait_ex(void volatile*, bool, int, bool, timespec const*) + 148
    frame #2: 0x0000007e970ba7c4 libc.so`NonPI::MutexLockWithTimeout(pthread_mutex_internal_t*, bool, timespec const*) + 224
    frame #3: 0x0000007e970a6328 libc.so`localtime_r + 36
    frame #4: 0x000000628b3f4790 magisk64`___lldb_unnamed_symbol64$$magisk64 + 192
    frame #5: 0x000000628b3f1b78 magisk64`___lldb_unnamed_symbol21$$magisk64 + 20
    frame #6: 0x000000628b40385c magisk64`___lldb_unnamed_symbol561$$magisk64 + 32
    frame #7: 0x000000628b3fe498 magisk64`___lldb_unnamed_symbol409$$magisk64 + 304
    frame #8: 0x000000628b3f9dc0 magisk64`___lldb_unnamed_symbol257$$magisk64 + 96
    frame #9: 0x000000628b3fe1e0 magisk64`___lldb_unnamed_symbol399$$magisk64 + 44
    frame #10: 0x000000628b3fe190 magisk64`___lldb_unnamed_symbol397$$magisk64 + 132
    frame #11: 0x0000007e9b4fc7e8 [vdso]`__kernel_rt_sigreturn
    frame #12: 0x0000007e97058c18 libc.so`__futex_wait_ex(void volatile*, bool, int, bool, timespec const*) + 148
    frame #13: 0x0000007e970ba7c4 libc.so`NonPI::MutexLockWithTimeout(pthread_mutex_internal_t*, bool, timespec const*) + 224
    frame #14: 0x0000007e970a6328 libc.so`localtime_r + 36
    frame #15: 0x000000628b3f4790 magisk64`___lldb_unnamed_symbol64$$magisk64 + 192
    frame #16: 0x000000628b3f1b78 magisk64`___lldb_unnamed_symbol21$$magisk64 + 20
    frame #17: 0x000000628b40385c magisk64`___lldb_unnamed_symbol561$$magisk64 + 32
    frame #18: 0x000000628b3fe498 magisk64`___lldb_unnamed_symbol409$$magisk64 + 304
    frame #19: 0x000000628b3f9dc0 magisk64`___lldb_unnamed_symbol257$$magisk64 + 96
    frame #20: 0x000000628b3fe1e0 magisk64`___lldb_unnamed_symbol399$$magisk64 + 44
    frame #21: 0x0000007e9b4fc7e8 [vdso]`__kernel_rt_sigreturn
    frame #22: 0x0000007e970abb84 libc.so`__bionic_open_tzdata_path(char const*, char const*, int*) + 392
    frame #23: 0x0000007e970ab9a0 libc.so`__bionic_open_tzdata + 68
    frame #24: 0x0000007e970a6b68 libc.so`tzload + 108
    frame #25: 0x0000007e970a5e50 libc.so`zoneinit + 56
    frame #26: 0x0000007e970a5cac libc.so`tzsetlcl + 148
    frame #27: 0x0000007e970ab8f0 libc.so`tzset_unlocked + 336
    frame #28: 0x0000007e970a6344 libc.so`localtime_r + 64
    frame #29: 0x000000628b3f4790 magisk64`___lldb_unnamed_symbol64$$magisk64 + 192
    frame #30: 0x000000628b3f1b78 magisk64`___lldb_unnamed_symbol21$$magisk64 + 20
    frame #31: 0x000000628b40385c magisk64`___lldb_unnamed_symbol561$$magisk64 + 32
    frame #32: 0x000000628b3fdca4 magisk64`___lldb_unnamed_symbol395$$magisk64 + 616
    frame #33: 0x000000628b40334c magisk64`___lldb_unnamed_symbol537$$magisk64 + 8
    frame #34: 0x0000007e970b973c libc.so`__pthread_start(void*) + 68
    frame #35: 0x0000007e97059684 libc.so`__start_thread + 68

🎉

With the help of Sui, I was able to get a root shell when "su" from Magisk was unavailable; otherwise this problem might never have been investigated.
With help from @yujincheng08, who taught me how to use lldb, I was able to get the full backtrace that "debuggerd" cannot.



yujincheng08 commented Mar 4, 2021

A similar issue was found by one of the LSPosed users: his module mounting was stuck. Amazingly, when he removed the sepolicy.rule file of the LSPosed module, mounting became normal. I'm afraid there is no tombstone because the mounting never finished. Here are some logs:
log_without_removing_sepolicy
log_with_sepolicy_removed

Besides, once @vvb2060 added some logging to Magisk, it amazingly worked without removing sepolicy.rule.


Well, after debugging with Rikka, it seems that her problem is different from mine.


miracle


Just confirmed with that user that Magisk works when Magisk hide is disabled. So I suppose it's the same problem.

@RikkaW RikkaW changed the title Possible undefined behavior cause deadlock Signal handler in Magisk hide causes deadlock Mar 4, 2021

RikkaW commented Mar 4, 2021

Update 2021-03-05

Found the real reason; the wrong inference has been removed.


kubalav pushed a commit to kubalav/Magisk that referenced this issue Sep 13, 2021
- Block signals in logging routine (fix topjohnwu#3976)
- Prevent possible deadlock after fork (stdio locks internally)
  by creating a new FILE pointer per logging call (thread/stack local)