
FATAL ERROR: Reached heap limit Allocation failed - JavaScript heap out of memory #106

Open · Nasir-S opened this issue Jul 19, 2023 · 1 comment

Nasir-S commented Jul 19, 2023

This issue is similar to #93, except that here it throws a fatal JS out-of-memory error.

I am also using this action to post the output of a `terraform plan` as a PR comment, read from a .txt file.

[3148:0x4bfddd0]     5515 ms: Mark-sweep (reduce) 1992.5 (2058.2) -> 1992.2 (2059.2) MB, 16.7 / 0.0 ms  (average mu = 0.958, current mu = 0.285) allocation failure scavenge might not succeed

<--- JS stacktrace --->

FATAL ERROR: Reached heap limit Allocation failed - JavaScript heap out of memory
 1: 0xb0a860 node::Abort() [/home/runner/runners/2.305.0/externals/node16/bin/node]
 2: 0xa1c193 node::FatalError(char const*, char const*) [/home/runner/runners/2.305.0/externals/node16/bin/node]
 3: 0xcf9a6e v8::Utils::ReportOOMFailure(v8::internal::Isolate*, char const*, bool) [/home/runner/runners/2.305.0/externals/node16/bin/node]
 4: 0xcf9de7 v8::internal::V8::FatalProcessOutOfMemory(v8::internal::Isolate*, char const*, bool) [/home/runner/runners/2.305.0/externals/node16/bin/node]
 5: 0xeb1685  [/home/runner/runners/2.305.0/externals/node16/bin/node]
 6: 0xec134d v8::internal::Heap::CollectGarbage(v8::internal::AllocationSpace, v8::internal::GarbageCollectionReason, v8::GCCallbackFlags) [/home/runner/runners/2.305.0/externals/node16/bin/node]
 7: 0xec404e v8::internal::Heap::AllocateRawWithRetryOrFailSlowPath(int, v8::internal::AllocationType, v8::internal::AllocationOrigin, v8::internal::AllocationAlignment) [/home/runner/runners/2.305.0/externals/node16/bin/node]
 8: 0xe852c2 v8::internal::Factory::AllocateRaw(int, v8::internal::AllocationType, v8::internal::AllocationAlignment) [/home/runner/runners/2.305.0/externals/node16/bin/node]
 9: 0xe7d8d4 v8::internal::FactoryBase<v8::internal::Factory>::AllocateRawWithImmortalMap(int, v8::internal::AllocationType, v8::internal::Map, v8::internal::AllocationAlignment) [/home/runner/runners/2.305.0/externals/node16/bin/node]
10: 0xe7f931 v8::internal::FactoryBase<v8::internal::Factory>::NewRawTwoByteString(int, v8::internal::AllocationType) [/home/runner/runners/2.305.0/externals/node16/bin/node]
11: 0x125af3c v8::internal::IncrementalStringBuilder::Extend() [/home/runner/runners/2.305.0/externals/node16/bin/node]
12: 0xfa9dbd v8::internal::JsonStringifier::SerializeString(v8::internal::Handle<v8::internal::String>) [/home/runner/runners/2.305.0/externals/node16/bin/node]
13: 0xfa93e9 v8::internal::JsonStringifier::SerializeString(v8::internal::Handle<v8::internal::String>) [/home/runner/runners/2.305.0/externals/node16/bin/node]
14: 0xfabc31 v8::internal::JsonStringifier::Result v8::internal::JsonStringifier::Serialize_<true>(v8::internal::Handle<v8::internal::Object>, bool, v8::internal::Handle<v8::internal::Object>) [/home/runner/runners/2.305.0/externals/node16/bin/node]
15: 0xfaf691 v8::internal::JsonStringifier::Result v8::internal::JsonStringifier::Serialize_<false>(v8::internal::Handle<v8::internal::Object>, bool, v8::internal::Handle<v8::internal::Object>) [/home/runner/runners/2.305.0/externals/node16/bin/node]
16: 0xfb0faf v8::internal::JsonStringify(v8::internal::Isolate*, v8::internal::Handle<v8::internal::Object>, v8::internal::Handle<v8::internal::Object>, v8::internal::Handle<v8::internal::Object>) [/home/runner/runners/2.305.0/externals/node16/bin/node]
17: 0xd7bc97 v8::internal::Builtin_JsonStringify(int, unsigned long*, v8::internal::Isolate*) [/home/runner/runners/2.305.0/externals/node16/bin/node]
18: 0x15f2e19  [/home/runner/runners/2.305.0/externals/node16/bin/node]

Is there any way to address this?

mshick (Owner) commented Aug 3, 2023

Yes, as in #93, the action should detect oversized input files and truncate them to prevent these failures.
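In the meantime, the truncation can be done in the workflow itself before the file ever reaches the action. A minimal sketch, assuming the plan output lives in a file; `plan.txt` and the 60000-byte cap are illustrative values, not anything from the action's docs:

```shell
# Hedged sketch (not the action's built-in behavior): cap the plan output
# before handing it to the action, keeping the comment body well below the
# heap pressure seen in the crash above. MAX_BYTES is an arbitrary example.
MAX_BYTES=60000
head -c "$MAX_BYTES" plan.txt > plan-truncated.txt
# Append a marker so readers know the plan was cut short.
if [ "$(wc -c < plan.txt)" -gt "$MAX_BYTES" ]; then
  printf '\n... (plan output truncated)\n' >> plan-truncated.txt
fi
```

You would then point the action's file input at `plan-truncated.txt` instead of the original.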

Out of curiosity, were you hoping to link this large file in the comment? If so, would something like an S3 integration for uploading large files and then linking them be useful to you?
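Another possible interim workaround is raising Node's heap ceiling for the step that runs the action. Node honors the `NODE_OPTIONS` environment variable, though whether the hosted runner propagates it into the action's node16 process is an assumption here, not something confirmed in this thread:

```shell
# Hedged workaround sketch: raise the V8 old-space limit from the ~2 GB
# seen in the crash log to 4 GB. Propagation of NODE_OPTIONS to the
# action's node process on the runner is an assumption.
export NODE_OPTIONS="--max-old-space-size=4096"
```

In a workflow this would go under the step's `env:` block. It only buys headroom; truncation is still the real fix for very large plans.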

mshick added the bug label Aug 3, 2023