Segfault on Linux #1488

Closed · revmischa opened this issue Jun 14, 2022 · 7 comments

Describe the bug

I am trying to convert my large project over to using vitest. I have a monorepo with some frontend and some node packages and I'm running several copies of vitest in parallel to test them. I can run all the tests just fine on my macOS dev machine.

When I run the tests in GitHub Actions (ubuntu-latest), two of the three Node packages segfault, while the frontend tests and one Node package run successfully.
I've tried `vitest run` with `--run`, `--threads=false`, min/max 1 thread, etc.; none of it makes a difference.
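For reference, a minimal sketch of the config-file equivalents of those flags (option names assume Vitest 0.x and mirror the CLI flags above):

```ts
// vitest.config.ts — sketch only; option names assume Vitest 0.x
import { defineConfig } from "vitest/config"

export default defineConfig({
  test: {
    threads: false, // same as passing --threads=false
    minThreads: 1,  // lower bound of the worker pool
    maxThreads: 1,  // upper bound of the worker pool
  },
})
```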

There is no error message other than "Segmentation fault". If you can guide me on how to get more debugging information on GHA (since I can't repro locally), I can provide it.

I posted about this on Discord a few days ago: https://discord.com/channels/917386801235247114/918057998914568213/985767695926571028

One of the packages crashes after running tests for a couple of minutes; the other crashes right after finishing everything:

[test:*common] > platform@1.0.0 test:common
[test:*common] > npm run -w platform-common test
[test:*common] 
[test:*common] 
[test:*common] > platform-common@1.0.0 test
[test:*common] > vitest run --run --threads=false
[test:*common] 
[test:*common] 
[test:*common]  RUN v0.14.2 /home/runner/work/platform/platform/packages/common
[test:*common] 
[test:*common]  √ src/config/env.test.ts  (2 tests) 27ms 22 MB heap used
[test:*common]  √ src/util/array.test.ts  (1 test) 2ms 22 MB heap used
[test:*common]  √ src/util/foo.test.ts  (8 tests) 13ms 23 MB heap used
[test:*common] 
[test:*common] Test Files 3 passed (3)
[test:*common]      Tests 11 passed (11)
[test:*common]       Time  1.82s (in thread 42ms, 4333.93%)
[test:*common] 
[test:*common] Segmentation fault (core dumped)
[test:*common] npm ERR! Lifecycle script `test` failed with error: 

Reproduction

How do I get more info?

System Info

System:
    OS: macOS 12.4
    CPU: (12) x64 Intel(R) Core(TM) i7-9750H CPU @ 2.60GHz
    Memory: 952.24 MB / 16.00 GB
    Shell: 5.8.1 - /bin/zsh
  Binaries:
    Node: 16.15.0 - /usr/local/bin/node
    Yarn: 1.22.17 - /usr/local/bin/yarn
    npm: 8.12.1 - /usr/local/bin/npm
  Browsers:
    Brave Browser: 102.1.39.120
    Safari: 15.5
  npmPackages:
    @vitejs/plugin-react: ^1.3.2 => 1.3.2
    vitest: ^0.14.2 => 0.14.2


Used Package Manager

npm

Validations

- [X] Follow our [Code of Conduct](https://github.com/vitest-dev/vitest/blob/main/CODE_OF_CONDUCT.md)
- [X] Read the [Contributing Guidelines](https://github.com/vitest-dev/vitest/blob/main/CONTRIBUTING.md).
- [X] Read the [docs](https://vitest.dev/guide/).
- [X] Check that there isn't [already an issue](https://github.com/vitest-dev/vitest/issues) that reports the same bug to avoid creating a duplicate.
- [X] Check that this is a concrete bug. For Q&A open a [GitHub Discussion](https://github.com/vitest-dev/vitest/discussions) or join our [Discord Chat Server](https://chat.vitest.dev).
- [X] The provided reproduction is a [minimal reproducible example](https://stackoverflow.com/help/minimal-reproducible-example) of the bug.
revmischa (Author) commented Jun 15, 2022

I tried `maxConcurrency: 1` on v0.15.0 with similar results. I'm really at a loss as to how to debug this.
I tried putting

import SegfaultHandler from "segfault-handler"
SegfaultHandler.registerHandler("crash.log")

in my vite.config.ts, but it had no effect. I'm not sure if there is a better place to hook this in.
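One possible hook is Vitest's `setupFiles` option, which runs inside the test environment before each test file; a minimal sketch, assuming a hypothetical `vitest.setup.ts`:

```ts
// vitest.config.ts — sketch; wires a hypothetical setup file into the test environment
import { defineConfig } from "vitest/config"

export default defineConfig({
  test: {
    setupFiles: ["./vitest.setup.ts"],
  },
})
```

```ts
// vitest.setup.ts — sketch; runs before each test file is loaded
import SegfaultHandler from "segfault-handler"

SegfaultHandler.registerHandler("crash.log") // write a native stack trace to crash.log on SIGSEGV
```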

OmgImAlexis commented

You'll likely need the segfault-handler at the top of the test file.
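For example, a minimal sketch of that, assuming a hypothetical `example.test.ts` (same `segfault-handler` API as above):

```ts
// example.test.ts — sketch; the handler is registered at module load, before any tests run
import SegfaultHandler from "segfault-handler"
import { describe, expect, it } from "vitest"

SegfaultHandler.registerHandler("crash.log") // dump a native stack trace to crash.log on SIGSEGV

describe("smoke", () => {
  it("runs with the crash handler installed", () => {
    expect(1 + 1).toBe(2)
  })
})
```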

revmischa (Author) commented
I tried running it on a Linux machine under lldb and at least got a stack trace:


Test Files  3 passed (3)
     Tests  11 passed (11)
      Time  1.67s (in thread 92ms, 1814.64%)


 PASS  Waiting for file changes...
       press h to show help, press q to quit
Process 23684 stopped
* thread #12, name = 'node', stop reason = signal SIGSEGV: invalid address (fault address: 0x7fffe5f2f300)
    frame #0: 0x00007fffe5f2f300
error: memory read failed for 0x7fffe5f2f200
(lldb) bt
* thread #12, name = 'node', stop reason = signal SIGSEGV: invalid address (fault address: 0x7fffe5f2f300)
  * frame #0: 0x00007fffe5f2f300
    frame #1: 0x00007ffff7c01431 libpthread.so.0`__nptl_deallocate_tsd at pthread_create.c:303:8
    frame #2: 0x00007ffff7c02471 libpthread.so.0`start_thread [inlined] __nptl_deallocate_tsd at pthread_create.c:258:6
    frame #3: 0x00007ffff7c0245e libpthread.so.0`start_thread(arg=0x00007fffe6ffd640) at pthread_create.c:484
    frame #4: 0x00007ffff7b24d53 libc.so.6`__clone + 67

revmischa (Author) commented
Full trace

(lldb) bt all
  thread #1, name = 'node'
    frame #0: 0x00007ffff7c0f31e libpthread.so.0`__GI___futex_abstimed_wait_cancelable64 at futex-internal.c:74:11
    frame #1: 0x00007ffff7c0f2d0 libpthread.so.0`__GI___futex_abstimed_wait_cancelable64(futex_word=0x00007fffe6ffd910, expected=23712, clockid=<unavailable>, abstime=0x0000000000000000, private=<unavailable>) at futex-internal.c:123
    frame #2: 0x00007ffff7c03984 libpthread.so.0`__pthread_clockjoin_ex(threadid=140737068914240, thread_return=0x0000000000000000, clockid=0, abstime=0x0000000000000000, block=true) at pthread_join_common.c:102:14
    frame #3: 0x00007ffff7c037b3 libpthread.so.0`__pthread_join(threadid=<unavailable>, thread_return=<unavailable>) at pthread_join.c:24:10 [artificial]
    frame #4: 0x00000000013cf06e node`uv_thread_join(tid=<unavailable>) at thread.c:273:10
    frame #5: 0x0000000000af62ee node`node::worker::Worker::JoinThread() + 46
    frame #6: 0x00000000009d0b08 node`node::Environment::RunAndClearNativeImmediates(bool) + 1080
    frame #7: 0x00000000009d0e9c node`node::Environment::InitializeLibuv()::'lambda'(uv_async_s*)::_FUN(uv_async_s*) + 60
    frame #8: 0x00000000013c0a86 node`uv__async_io.part.1 at async.c:163:5
    frame #9: 0x00000000013d2ff4 node`uv__io_poll at epoll.c:374:11
    frame #10: 0x00000000013c13d8 node`uv_run(loop=0x00000000045bc8a0, mode=UV_RUN_DEFAULT) at core.c:389:5
    frame #11: 0x0000000000a7b642 node`node::NodeMainInstance::Run() + 546
    frame #12: 0x0000000000a03805 node`node::Start(int, char**) + 277
    frame #13: 0x00007ffff7a35565 libc.so.6`__libc_start_main + 213
    frame #14: 0x000000000098c58c node`_start + 41
  thread #2, name = 'node'
    frame #0: 0x00007ffff7b2509e libc.so.6`epoll_wait + 94
    frame #1: 0x00000000013d3197 node`uv__io_poll at epoll.c:236:14
    frame #2: 0x00000000013c13d8 node`uv_run(loop=0x00000000046132d8, mode=UV_RUN_DEFAULT) at core.c:389:5
    frame #3: 0x0000000000aacfdb node`node::WorkerThreadsTaskRunner::DelayedTaskScheduler::Start()::'lambda'(void*)::_FUN(void*) + 123
    frame #4: 0x00007ffff7c02450 libpthread.so.0`start_thread(arg=0x00007ffff7a07640) at pthread_create.c:473:8
    frame #5: 0x00007ffff7b24d53 libc.so.6`__clone + 67
  thread #3, name = 'node'
    frame #0: 0x00007ffff7c0f31e libpthread.so.0`__GI___futex_abstimed_wait_cancelable64 at futex-internal.c:74:11
    frame #1: 0x00007ffff7c0f2d0 libpthread.so.0`__GI___futex_abstimed_wait_cancelable64(futex_word=0x0000000004612eb0, expected=0, clockid=<unavailable>, abstime=0x0000000000000000, private=<unavailable>) at futex-internal.c:123
    frame #2: 0x00007ffff7c08540 libpthread.so.0`__pthread_cond_wait at pthread_cond_wait.c:504:10
    frame #3: 0x00007ffff7c08460 libpthread.so.0`__pthread_cond_wait(cond=0x0000000004612e88, mutex=0x0000000004612e60) at pthread_cond_wait.c:628
    frame #4: 0x00000000013cf759 node`uv_cond_wait at thread.c:780:7
    frame #5: 0x0000000000aa840b node`node::(anonymous namespace)::PlatformWorkerThread(void*) + 267
    frame #6: 0x00007ffff7c02450 libpthread.so.0`start_thread(arg=0x00007ffff7206640) at pthread_create.c:473:8
    frame #7: 0x00007ffff7b24d53 libc.so.6`__clone + 67
  thread #4, name = 'node'
    frame #0: 0x00007ffff7c0f31e libpthread.so.0`__GI___futex_abstimed_wait_cancelable64 at futex-internal.c:74:11
    frame #1: 0x00007ffff7c0f2d0 libpthread.so.0`__GI___futex_abstimed_wait_cancelable64(futex_word=0x0000000004612eb0, expected=0, clockid=<unavailable>, abstime=0x0000000000000000, private=<unavailable>) at futex-internal.c:123
    frame #2: 0x00007ffff7c08540 libpthread.so.0`__pthread_cond_wait at pthread_cond_wait.c:504:10
    frame #3: 0x00007ffff7c08460 libpthread.so.0`__pthread_cond_wait(cond=0x0000000004612e88, mutex=0x0000000004612e60) at pthread_cond_wait.c:628
    frame #4: 0x00000000013cf759 node`uv_cond_wait at thread.c:780:7
    frame #5: 0x0000000000aa840b node`node::(anonymous namespace)::PlatformWorkerThread(void*) + 267
    frame #6: 0x00007ffff7c02450 libpthread.so.0`start_thread(arg=0x00007ffff6a05640) at pthread_create.c:473:8
    frame #7: 0x00007ffff7b24d53 libc.so.6`__clone + 67
  thread #5, name = 'node'
    frame #0: 0x00007ffff7c0f31e libpthread.so.0`__GI___futex_abstimed_wait_cancelable64 at futex-internal.c:74:11
    frame #1: 0x00007ffff7c0f2d0 libpthread.so.0`__GI___futex_abstimed_wait_cancelable64(futex_word=0x0000000004612eb0, expected=0, clockid=<unavailable>, abstime=0x0000000000000000, private=<unavailable>) at futex-internal.c:123
    frame #2: 0x00007ffff7c08540 libpthread.so.0`__pthread_cond_wait at pthread_cond_wait.c:504:10
    frame #3: 0x00007ffff7c08460 libpthread.so.0`__pthread_cond_wait(cond=0x0000000004612e88, mutex=0x0000000004612e60) at pthread_cond_wait.c:628
    frame #4: 0x00000000013cf759 node`uv_cond_wait at thread.c:780:7
    frame #5: 0x0000000000aa840b node`node::(anonymous namespace)::PlatformWorkerThread(void*) + 267
    frame #6: 0x00007ffff7c02450 libpthread.so.0`start_thread(arg=0x00007ffff6204640) at pthread_create.c:473:8
    frame #7: 0x00007ffff7b24d53 libc.so.6`__clone + 67
  thread #6, name = 'node'
    frame #0: 0x00007ffff7c0f31e libpthread.so.0`__GI___futex_abstimed_wait_cancelable64 at futex-internal.c:74:11
    frame #1: 0x00007ffff7c0f2d0 libpthread.so.0`__GI___futex_abstimed_wait_cancelable64(futex_word=0x0000000004612eb0, expected=0, clockid=<unavailable>, abstime=0x0000000000000000, private=<unavailable>) at futex-internal.c:123
    frame #2: 0x00007ffff7c08540 libpthread.so.0`__pthread_cond_wait at pthread_cond_wait.c:504:10
    frame #3: 0x00007ffff7c08460 libpthread.so.0`__pthread_cond_wait(cond=0x0000000004612e88, mutex=0x0000000004612e60) at pthread_cond_wait.c:628
    frame #4: 0x00000000013cf759 node`uv_cond_wait at thread.c:780:7
    frame #5: 0x0000000000aa840b node`node::(anonymous namespace)::PlatformWorkerThread(void*) + 267
    frame #6: 0x00007ffff7c02450 libpthread.so.0`start_thread(arg=0x00007ffff5a03640) at pthread_create.c:473:8
    frame #7: 0x00007ffff7b24d53 libc.so.6`__clone + 67
  thread #7, name = 'node'
    frame #0: 0x00007ffff7c0f31e libpthread.so.0`__GI___futex_abstimed_wait_cancelable64 at futex-internal.c:74:11
    frame #1: 0x00007ffff7c0f2d0 libpthread.so.0`__GI___futex_abstimed_wait_cancelable64(futex_word=0x00000000045b56e0, expected=0, clockid=<unavailable>, abstime=0x0000000000000000, private=<unavailable>) at futex-internal.c:123
    frame #2: 0x00007ffff7c0b4a8 libpthread.so.0`__new_sem_wait_slow64(sem=0x00000000045b56e0, abstime=0x0000000000000000, clockid=0) at sem_waitcommon.c:184:10
    frame #3: 0x00000000013cf592 node`uv_sem_wait at thread.c:626:9
    frame #4: 0x00000000013cf580 node`uv_sem_wait(sem=0x00000000045b56e0) at thread.c:682
    frame #5: 0x0000000000b3c7e5 node`node::inspector::(anonymous namespace)::StartIoThreadMain(void*) + 53
    frame #6: 0x00007ffff7c02450 libpthread.so.0`start_thread(arg=0x00007ffff7fc2640) at pthread_create.c:473:8
    frame #7: 0x00007ffff7b24d53 libc.so.6`__clone + 67
  thread #8, name = 'node'
    frame #0: 0x00007ffff7c0f31e libpthread.so.0`__GI___futex_abstimed_wait_cancelable64 at futex-internal.c:74:11
    frame #1: 0x00007ffff7c0f2d0 libpthread.so.0`__GI___futex_abstimed_wait_cancelable64(futex_word=0x00000000045bc86c, expected=0, clockid=<unavailable>, abstime=0x0000000000000000, private=<unavailable>) at futex-internal.c:123
    frame #2: 0x00007ffff7c08540 libpthread.so.0`__pthread_cond_wait at pthread_cond_wait.c:504:10
    frame #3: 0x00007ffff7c08460 libpthread.so.0`__pthread_cond_wait(cond=0x00000000045bc840, mutex=0x00000000045bc800) at pthread_cond_wait.c:628
    frame #4: 0x00000000013cf759 node`uv_cond_wait at thread.c:780:7
    frame #5: 0x00000000013bbc94 node`worker(arg=0x0000000000000000) at threadpool.c:76:7
    frame #6: 0x00007ffff7c02450 libpthread.so.0`start_thread(arg=0x00007ffff5202640) at pthread_create.c:473:8
    frame #7: 0x00007ffff7b24d53 libc.so.6`__clone + 67
  thread #9, name = 'node'
    frame #0: 0x00007ffff7c0f31e libpthread.so.0`__GI___futex_abstimed_wait_cancelable64 at futex-internal.c:74:11
    frame #1: 0x00007ffff7c0f2d0 libpthread.so.0`__GI___futex_abstimed_wait_cancelable64(futex_word=0x00000000045bc86c, expected=0, clockid=<unavailable>, abstime=0x0000000000000000, private=<unavailable>) at futex-internal.c:123
    frame #2: 0x00007ffff7c08540 libpthread.so.0`__pthread_cond_wait at pthread_cond_wait.c:504:10
    frame #3: 0x00007ffff7c08460 libpthread.so.0`__pthread_cond_wait(cond=0x00000000045bc840, mutex=0x00000000045bc800) at pthread_cond_wait.c:628
    frame #4: 0x00000000013cf759 node`uv_cond_wait at thread.c:780:7
    frame #5: 0x00000000013bbc94 node`worker(arg=0x0000000000000000) at threadpool.c:76:7
    frame #6: 0x00007ffff7c02450 libpthread.so.0`start_thread(arg=0x00007ffff4a01640) at pthread_create.c:473:8
    frame #7: 0x00007ffff7b24d53 libc.so.6`__clone + 67
  thread #10, name = 'node'
    frame #0: 0x00007ffff7c0f31e libpthread.so.0`__GI___futex_abstimed_wait_cancelable64 at futex-internal.c:74:11
    frame #1: 0x00007ffff7c0f2d0 libpthread.so.0`__GI___futex_abstimed_wait_cancelable64(futex_word=0x00000000045bc86c, expected=0, clockid=<unavailable>, abstime=0x0000000000000000, private=<unavailable>) at futex-internal.c:123
    frame #2: 0x00007ffff7c08540 libpthread.so.0`__pthread_cond_wait at pthread_cond_wait.c:504:10
    frame #3: 0x00007ffff7c08460 libpthread.so.0`__pthread_cond_wait(cond=0x00000000045bc840, mutex=0x00000000045bc800) at pthread_cond_wait.c:628
    frame #4: 0x00000000013cf759 node`uv_cond_wait at thread.c:780:7
    frame #5: 0x00000000013bbc94 node`worker(arg=0x0000000000000000) at threadpool.c:76:7
    frame #6: 0x00007ffff7c02450 libpthread.so.0`start_thread(arg=0x00007fffe7fff640) at pthread_create.c:473:8
    frame #7: 0x00007ffff7b24d53 libc.so.6`__clone + 67
  thread #11, name = 'node'
    frame #0: 0x00007ffff7c0f31e libpthread.so.0`__GI___futex_abstimed_wait_cancelable64 at futex-internal.c:74:11
    frame #1: 0x00007ffff7c0f2d0 libpthread.so.0`__GI___futex_abstimed_wait_cancelable64(futex_word=0x00000000045bc86c, expected=0, clockid=<unavailable>, abstime=0x0000000000000000, private=<unavailable>) at futex-internal.c:123
    frame #2: 0x00007ffff7c08540 libpthread.so.0`__pthread_cond_wait at pthread_cond_wait.c:504:10
    frame #3: 0x00007ffff7c08460 libpthread.so.0`__pthread_cond_wait(cond=0x00000000045bc840, mutex=0x00000000045bc800) at pthread_cond_wait.c:628
    frame #4: 0x00000000013cf759 node`uv_cond_wait at thread.c:780:7
    frame #5: 0x00000000013bbc94 node`worker(arg=0x0000000000000000) at threadpool.c:76:7
    frame #6: 0x00007ffff7c02450 libpthread.so.0`start_thread(arg=0x00007fffe77fe640) at pthread_create.c:473:8
    frame #7: 0x00007ffff7b24d53 libc.so.6`__clone + 67
* thread #12, name = 'node', stop reason = signal SIGSEGV: invalid address (fault address: 0x7fffe5f2f300)
  * frame #0: 0x00007fffe5f2f300
    frame #1: 0x00007ffff7c01431 libpthread.so.0`__nptl_deallocate_tsd at pthread_create.c:303:8
    frame #2: 0x00007ffff7c02471 libpthread.so.0`start_thread [inlined] __nptl_deallocate_tsd at pthread_create.c:258:6
    frame #3: 0x00007ffff7c0245e libpthread.so.0`start_thread(arg=0x00007fffe6ffd640) at pthread_create.c:484
    frame #4: 0x00007ffff7b24d53 libc.so.6`__clone + 67
  thread #13, name = 'node'
    frame #0: 0x00007ffff7b2509e libc.so.6`epoll_wait + 94
    frame #1: 0x00000000013d3197 node`uv__io_poll at epoll.c:236:14
    frame #2: 0x00000000013c13d8 node`uv_run(loop=0x00007fffe6bfba78, mode=UV_RUN_DEFAULT) at core.c:389:5
    frame #3: 0x0000000000afcc4b node`node::worker::Worker::Run() + 4187
    frame #4: 0x0000000000afd6f8 node`node::worker::Worker::StartThread(v8::FunctionCallbackInfo<v8::Value> const&)::'lambda'(void*)::_FUN(void*) + 56
    frame #5: 0x00007ffff7c02450 libpthread.so.0`start_thread(arg=0x00007fffe6bfc640) at pthread_create.c:473:8
    frame #6: 0x00007ffff7b24d53 libc.so.6`__clone + 67
  thread #14, name = 'node'
    frame #0: 0x0000000000e63280 node`v8::internal::interpreter::TryFinallyBuilder::BeginTry(v8::internal::interpreter::Register)
    frame #1: 0x0000000000e58c9e node`v8::internal::interpreter::BytecodeGenerator::VisitForOfStatement(v8::internal::ForOfStatement*) + 942
    frame #2: 0x0000000000e51d04 node`v8::internal::interpreter::BytecodeGenerator::VisitStatements(v8::internal::ZoneList<v8::internal::Statement*> const*) + 84
    frame #3: 0x0000000000e51e5b node`v8::internal::interpreter::BytecodeGenerator::VisitBlockDeclarationsAndStatements(v8::internal::Block*) + 171
    frame #4: 0x0000000000e51ed7 node`v8::internal::interpreter::BytecodeGenerator::VisitBlock(v8::internal::Block*) + 71
    frame #5: 0x0000000000e51d04 node`v8::internal::interpreter::BytecodeGenerator::VisitStatements(v8::internal::ZoneList<v8::internal::Statement*> const*) + 84
    frame #6: 0x0000000000e5249a node`v8::internal::interpreter::BytecodeGenerator::GenerateBytecodeBody() + 554
    frame #7: 0x0000000000e527c9 node`v8::internal::interpreter::BytecodeGenerator::GenerateBytecode(unsigned long) + 313
    frame #8: 0x0000000000e64ee0 node`v8::internal::interpreter::InterpreterCompilationJob::ExecuteJobImpl() + 128
    frame #9: 0x0000000000c8238b node`v8::internal::(anonymous namespace)::ExecuteSingleUnoptimizedCompilationJob(v8::internal::ParseInfo*, v8::internal::FunctionLiteral*, v8::internal::AccountingAllocator*, std::vector<v8::internal::FunctionLiteral*, std::allocator<v8::internal::FunctionLiteral*> >*) + 219
    frame #10: 0x0000000000c88598 node`v8::internal::(anonymous namespace)::IterativelyExecuteAndFinalizeUnoptimizedCompilationJobs(v8::internal::Isolate*, v8::internal::Handle<v8::internal::SharedFunctionInfo>, v8::internal::Handle<v8::internal::Script>, v8::internal::ParseInfo*, v8::internal::AccountingAllocator*, v8::internal::IsCompiledScope*, std::vector<v8::internal::FinalizeUnoptimizedCompilationData, std::allocator<v8::internal::FinalizeUnoptimizedCompilationData> >*) (.constprop.390) + 216
    frame #11: 0x0000000000c8b59e node`v8::internal::Compiler::Compile(v8::internal::Handle<v8::internal::SharedFunctionInfo>, v8::internal::Compiler::ClearExceptionFlag, v8::internal::IsCompiledScope*) + 862
    frame #12: 0x0000000000c8d6bc node`v8::internal::Compiler::Compile(v8::internal::Handle<v8::internal::JSFunction>, v8::internal::Compiler::ClearExceptionFlag, v8::internal::IsCompiledScope*) + 268
    frame #13: 0x000000000108658a node`v8::internal::Runtime_CompileLazy(int, unsigned long*, v8::internal::Isolate*) + 218
    frame #14: 0x0000000001448df9 node`Builtins_CEntry_Return1_DontSaveFPRegs_ArgvOnStack_NoBuiltinExit + 57
    frame #15: 0x00000000013e0310 node`Builtins_CompileLazy + 848
    frame #16: 0x00000000013df922 node`Builtins_InterpreterEntryTrampoline + 194
    frame #17: 0x00000000013d9859 node`Builtins_ArgumentsAdaptorTrampoline + 185
    frame #18: 0x000000000145affe node`Builtins_ArrayForEach + 638
    frame #19: 0x00000000013df922 node`Builtins_InterpreterEntryTrampoline + 194
    frame #20: 0x00000000013dd63a node`Builtins_JSEntryTrampoline + 90
    frame #21: 0x00000000013dd418 node`Builtins_JSEntry + 120
    frame #22: 0x0000000000d049f1 node`v8::internal::(anonymous namespace)::Invoke(v8::internal::Isolate*, v8::internal::(anonymous namespace)::InvokeParams const&) + 449
    frame #23: 0x0000000000d0585f node`v8::internal::Execution::Call(v8::internal::Isolate*, v8::internal::Handle<v8::internal::Object>, v8::internal::Handle<v8::internal::Object>, int, v8::internal::Handle<v8::internal::Object>*) + 95
    frame #24: 0x0000000000bcc5ac node`v8::Function::Call(v8::Local<v8::Context>, v8::Local<v8::Value>, int, v8::Local<v8::Value>*) + 332
    frame #25: 0x00000000009926c4 node`node::InitializePrimordials(v8::Local<v8::Context>) + 452
    frame #26: 0x0000000000992858 node`node::GetPerContextExports(v8::Local<v8::Context>) + 232
    frame #27: 0x00000000009925e8 node`node::InitializePrimordials(v8::Local<v8::Context>) + 232
    frame #28: 0x000000000099294d node`node::NewContext(v8::Isolate*, v8::Local<v8::ObjectTemplate>) + 93
    frame #29: 0x0000000000afc280 node`node::worker::Worker::Run() + 1680
    frame #30: 0x0000000000afd6f8 node`node::worker::Worker::StartThread(v8::FunctionCallbackInfo<v8::Value> const&)::'lambda'(void*)::_FUN(void*) + 56
    frame #31: 0x00007ffff7c02450 libpthread.so.0`start_thread(arg=0x00007fffe67fb640) at pthread_create.c:473:8
    frame #32: 0x00007ffff7b24d53 libc.so.6`__clone + 67
(lldb) 

revmischa (Author) commented Jun 15, 2022

There is a child process spawned at the point where the segfault happens:

cyber      31599  1.8  1.6 960224 133068 pts/4   Sl+  08:50   0:00  |   |   \_ lldb node -- ../../node_modules/.bin/vitest
cyber      31610  0.1  0.4 137816 39992 pts/4    S    08:50   0:00  |   |       \_ /usr/lib/llvm-11/bin/lldb-server-11.0.1 gdbserver --fd=7 --native-regs --setsid
cyber      31617 16.0  2.0 43479200 167272 pts/4 tl   08:50   0:03  |   |           \_ /usr/bin/node ../../node_modules/.bin/vitest
cyber      31639  0.1  0.1 712272 11956 pts/4    Sl   08:50   0:00  |   |               \_ /home/cyber/dev/platform/node_modules/esbuild/lib/downloaded-esbuild-linux-64-esbuild --service=0.14.38 --ping

If I attach to the child process it shows:

sudo lldb -p 31639                          
(lldb) process attach --pid 31639
Process 31639 stopped
* thread #1, name = 'downloaded-esbu', stop reason = signal SIGSTOP
    frame #0: 0x000000000049519b downloaded-esbuild-linux-64-esbuild`___lldb_unnamed_symbol1$$downloaded-esbuild-linux-64-esbuild + 606619
downloaded-esbuild-linux-64-esbuild`___lldb_unnamed_symbol1$$downloaded-esbuild-linux-64-esbuild:
->  0x49519b <+606619>: cmpq   $-0xfff, %rax             ; imm = 0xF001 
    0x4951a1 <+606625>: jbe    0x4951c3                  ; <+606659>
    0x4951a3 <+606627>: movq   $-0x1, 0x28(%rsp)
    0x4951ac <+606636>: movq   $0x0, 0x30(%rsp)
  thread #2, name = 'downloaded-esbu', stop reason = signal SIGSTOP
    frame #0: 0x00000000004678e3 downloaded-esbuild-linux-64-esbuild`___lldb_unnamed_symbol1$$downloaded-esbuild-linux-64-esbuild + 420067
downloaded-esbuild-linux-64-esbuild`___lldb_unnamed_symbol1$$downloaded-esbuild-linux-64-esbuild:
->  0x4678e3 <+420067>: movl   %eax, 0x30(%rsp)
    0x4678e7 <+420071>: retq   
    0x4678e8 <+420072>: int3   
    0x4678e9 <+420073>: int3   
  thread #3, name = 'downloaded-esbu', stop reason = signal SIGSTOP
    frame #0: 0x00000000004678e3 downloaded-esbuild-linux-64-esbuild`___lldb_unnamed_symbol1$$downloaded-esbuild-linux-64-esbuild + 420067
downloaded-esbuild-linux-64-esbuild`___lldb_unnamed_symbol1$$downloaded-esbuild-linux-64-esbuild:
->  0x4678e3 <+420067>: movl   %eax, 0x30(%rsp)
    0x4678e7 <+420071>: retq   
    0x4678e8 <+420072>: int3   
    0x4678e9 <+420073>: int3   
  thread #4, name = 'downloaded-esbu', stop reason = signal SIGSTOP
    frame #0: 0x00000000004678e3 downloaded-esbuild-linux-64-esbuild`___lldb_unnamed_symbol1$$downloaded-esbuild-linux-64-esbuild + 420067
downloaded-esbuild-linux-64-esbuild`___lldb_unnamed_symbol1$$downloaded-esbuild-linux-64-esbuild:
->  0x4678e3 <+420067>: movl   %eax, 0x30(%rsp)
    0x4678e7 <+420071>: retq   
    0x4678e8 <+420072>: int3   
    0x4678e9 <+420073>: int3   
  thread #5, name = 'downloaded-esbu', stop reason = signal SIGSTOP
    frame #0: 0x00000000004678e3 downloaded-esbuild-linux-64-esbuild`___lldb_unnamed_symbol1$$downloaded-esbuild-linux-64-esbuild + 420067
downloaded-esbuild-linux-64-esbuild`___lldb_unnamed_symbol1$$downloaded-esbuild-linux-64-esbuild:
->  0x4678e3 <+420067>: movl   %eax, 0x30(%rsp)
    0x4678e7 <+420071>: retq   
    0x4678e8 <+420072>: int3   
    0x4678e9 <+420073>: int3   
  thread #6, name = 'downloaded-esbu', stop reason = signal SIGSTOP
    frame #0: 0x00000000004678e3 downloaded-esbuild-linux-64-esbuild`___lldb_unnamed_symbol1$$downloaded-esbuild-linux-64-esbuild + 420067
downloaded-esbuild-linux-64-esbuild`___lldb_unnamed_symbol1$$downloaded-esbuild-linux-64-esbuild:
->  0x4678e3 <+420067>: movl   %eax, 0x30(%rsp)
    0x4678e7 <+420071>: retq   
    0x4678e8 <+420072>: int3   
    0x4678e9 <+420073>: int3   
  thread #7, name = 'downloaded-esbu', stop reason = signal SIGSTOP
    frame #0: 0x00000000004678e3 downloaded-esbuild-linux-64-esbuild`___lldb_unnamed_symbol1$$downloaded-esbuild-linux-64-esbuild + 420067
downloaded-esbuild-linux-64-esbuild`___lldb_unnamed_symbol1$$downloaded-esbuild-linux-64-esbuild:
->  0x4678e3 <+420067>: movl   %eax, 0x30(%rsp)
    0x4678e7 <+420071>: retq   
    0x4678e8 <+420072>: int3   
    0x4678e9 <+420073>: int3   

Executable module set to "/home/cyber/dev/platform/node_modules/esbuild/lib/downloaded-esbuild-linux-64-esbuild".
Architecture set to: x86_64-pc-linux-gnu.

revmischa changed the title from "Segfault on GitHub Actions" to "Segfault on Linux" on Jun 20, 2022
revmischa (Author) commented

I found that upgrading my @aws-sdk/* deps fixed this issue.

Aslemammad (Member) commented

@revmischa Thank you for collaborating on Discord!

revmischa mentioned this issue Nov 18, 2022
github-actions bot locked and limited conversation to collaborators on Jun 16, 2023