
[v2] npm start, build fails with JavaScript out of memory because of huge files #4785

Closed
idontknowjs opened this issue May 13, 2021 · 7 comments
Labels
bug An error in the Docusaurus core causing instability or issues with its execution closed: duplicate This issue or pull request already exists in another issue or pull request

Comments

@idontknowjs (Contributor)

πŸ› Bug Report

npm start crashes with JavaScript out of memory error after 42%.
npm run build crashes with JavaScript out of memory error after 23%.

Error logs for npm start:

* Client β–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆ building (42%) 3/5 entries 746/800 dependencies 92/379 modules 79 active 
 babel-loader Β» mdx-loader Β» docs\contribute\guidelines-ui\icon.md

i ο½’wdsο½£: Project is running at http://localhost:3000/
i ο½’wdsο½£: webpack output is served from /
i ο½’wdsο½£: Content not from webpack is served from ...
i ο½’wdsο½£: 404s will fallback to /index.html

<--- Last few GCs --->

[16236:000001C176447760]   186754 ms: Mark-sweep (reduce) 2034.7 (2052.1) -> 2034.3 (2053.1) MB, 4644.7 / 0.1 ms  (average mu = 0.089, current mu = 0.002) allocation failure scavenge might not succeed
[16236:000001C176447760]   192112 ms: Mark-sweep (reduce) 2035.3 (2055.1) -> 2034.8 (2055.8) MB, 5350.2 / 0.1 ms  (average mu = 0.041, current mu = 0.001) allocation failure scavenge might not succeed


<--- JS stacktrace --->

FATAL ERROR: MarkCompactCollector: young object promotion failed Allocation failed - JavaScript heap out of memory
 1: 00007FF70D36046F napi_wrap+109311
 2: 00007FF70D305156 v8::internal::OrderedHashTable<v8::internal::OrderedHashMap,2>::NumberOfElementsOffset+33302
 3: 00007FF70D305F26 node::OnFatalError+294
 4: 00007FF70DBD2B4E v8::Isolate::ReportExternalAllocationLimitReached+94
 5: 00007FF70DBB792D v8::SharedArrayBuffer::Externalize+781
 6: 00007FF70DA61CCC v8::internal::Heap::EphemeronKeyWriteBarrierFromCode+1516
 7: 00007FF70DA4C86B v8::internal::NativeContextInferrer::Infer+59451
 8: 00007FF70DA31CFF v8::internal::MarkingWorklists::SwitchToContextSlow+56991
 9: 00007FF70DA4591B v8::internal::NativeContextInferrer::Infer+30955
10: 00007FF70DA3CA3D v8::internal::MarkCompactCollector::EnsureSweepingCompleted+6269
11: 00007FF70DA44B6E v8::internal::NativeContextInferrer::Infer+27454
12: 00007FF70DA48B2B v8::internal::NativeContextInferrer::Infer+43771
13: 00007FF70DA52472 v8::internal::ItemParallelJob::Task::RunInternal+18
14: 00007FF70DA52401 v8::internal::ItemParallelJob::Run+641
15: 00007FF70DA25C63 v8::internal::MarkingWorklists::SwitchToContextSlow+7683
16: 00007FF70DA3CEEC v8::internal::MarkCompactCollector::EnsureSweepingCompleted+7468
17: 00007FF70DA3B734 v8::internal::MarkCompactCollector::EnsureSweepingCompleted+1396
18: 00007FF70DA392B8 v8::internal::MarkingWorklists::SwitchToContextSlow+87128
19: 00007FF70DA67A91 v8::internal::Heap::LeftTrimFixedArray+929
20: 00007FF70DA69B75 v8::internal::Heap::PageFlagsAreConsistent+789
21: 00007FF70DA5EDE1 v8::internal::Heap::CollectGarbage+2033
22: 00007FF70DA5D005 v8::internal::Heap::AllocateExternalBackingStore+1317
23: 00007FF70DA7D2A7 v8::internal::Factory::NewFillerObject+183
24: 00007FF70D7ACC31 v8::internal::interpreter::JumpTableTargetOffsets::iterator::operator=+1409
25: 00007FF70DC5B50D v8::internal::SetupIsolateDelegate::SetupHeap+463949
26: 00007FF70DC35A68 v8::internal::SetupIsolateDelegate::SetupHeap+309672
27: 00007FF70DBEFDD1 v8::internal::SetupIsolateDelegate::SetupHeap+23825
28: 0000021D94260858

Error logs for npm run build:

[en] Creating an optimized production build...

* Client β–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆ building (23%) 0/1 entries 1358/1378 dependencies 115/433 modules 140 active
 babel-loader Β» mdx-loader Β» docs\troubleshoot\faq.md

* Server β–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆ building (35%) 0/1 entries 1302/1423 dependencies 303/600 modules 82 active
 babel-loader Β» mdx-loader Β» docs\contribute\guidelines-ui\ui.md


<--- Last few GCs --->

[14572:000001F439CA0710]   163441 ms: Mark-sweep (reduce) 2032.7 (2054.4) -> 2032.2 (2054.9) MB, 4677.5 / 0.1 ms  (average mu = 0.154, current mu = 0.128) allocation failure scavenge might not succeed
[14572:000001F439CA0710]   168904 ms: Mark-sweep (reduce) 2033.2 (2051.9) -> 2032.8 (2052.9) MB, 4305.5 / 0.1 ms  (average mu = 0.184, current mu = 0.212) allocation failure scavenge might not succeed


<--- JS stacktrace --->

FATAL ERROR: Ineffective mark-compacts near heap limit Allocation failed - JavaScript heap out of memory
 1: 00007FF70D36046F napi_wrap+109311
 2: 00007FF70D305156 v8::internal::OrderedHashTable<v8::internal::OrderedHashMap,2>::NumberOfElementsOffset+33302
 3: 00007FF70D305F26 node::OnFatalError+294
 4: 00007FF70DBD2B4E v8::Isolate::ReportExternalAllocationLimitReached+94
 5: 00007FF70DBB792D v8::SharedArrayBuffer::Externalize+781
 6: 00007FF70DA61CCC v8::internal::Heap::EphemeronKeyWriteBarrierFromCode+1516
 7: 00007FF70DA6D04A v8::internal::Heap::ProtectUnprotectedMemoryChunks+1258
 8: 00007FF70DA6A1F9 v8::internal::Heap::PageFlagsAreConsistent+2457
 9: 00007FF70DA5EDE1 v8::internal::Heap::CollectGarbage+2033
10: 00007FF70DA5D005 v8::internal::Heap::AllocateExternalBackingStore+1317
11: 00007FF70DA7D2A7 v8::internal::Factory::NewFillerObject+183
12: 00007FF70D7ACC31 v8::internal::interpreter::JumpTableTargetOffsets::iterator::operator=+1409
13: 00007FF70DC5B50D v8::internal::SetupIsolateDelegate::SetupHeap+463949
14: 00007FF70DC35A68 v8::internal::SetupIsolateDelegate::SetupHeap+309672
15: 00007FF70DBEFDD1 v8::internal::SetupIsolateDelegate::SetupHeap+23825
16: 000001B9069CD34E

Have you read the Contributing Guidelines on issues?

yes

To Reproduce

  1. git clone https://github.com/covalentbond/docs-site.git
  2. npm install
  3. npm start

Expected Behavior

npm start should not crash, and the dev server should start on port 3000.
npm run build should build the app successfully.

Your Environment

@idontknowjs idontknowjs added bug An error in the Docusaurus core causing instability or issues with its execution status: needs triage This issue has not been triaged by maintainers labels May 13, 2021
@slorber (Collaborator) commented May 13, 2021

Have you tried increasing the memory allocated to the Node.js process?

Something like this may help for large sites: NODE_OPTIONS="--max-old-space-size=8192"
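For example, a hypothetical invocation (on Windows you would use `set` or `cross-env` instead of `export`; the 8 GB size is illustrative, tune it to your machine's RAM):

```shell
# Raise the V8 old-space limit to 8 GB for any Node process started in this shell:
export NODE_OPTIONS="--max-old-space-size=8192"

# Verify the limit took effect (prints the heap size limit in bytes):
node -e "console.log(require('v8').getHeapStatistics().heap_size_limit)"

# Then run the usual commands:
#   npm start
#   npm run build
```

Note that the heap size limit reported by V8 is slightly larger than the old-space size alone, since it also includes the young generation and other spaces.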

@idontknowjs (Author)

I tried changing the start script to node --max-old-space-size=8192 node_modules/@docusaurus/core/bin/docusaurus start, but the process still freezes at 21%.

Babel also prints a note:

[BABEL] Note: The code generator has deoptimised the styling of \docs-site\docs\appendix\tpsr.md as it 
exceeds the max of 500KB.

@idontknowjs (Author)

hey @slorber - yes, increasing the allocated memory seems to work!

However, the file tpsr.md, at around 23k lines and nearly 4 MB, still causes the app server to freeze at a certain percentage. Without this particular file, the app both runs and builds fine.

@slorber (Collaborator) commented May 17, 2021

We use MDX to parse the markdown content and transform it into React components.

Unfortunately, it is probably difficult to process a file of that size, but there might be other ways to create that page.

If the content is pure markdown, we plan to eventually allow providing an alternate md parser (#3018), which may be more memory-efficient (but also more limited, since React can't be used inside the markdown).

Why does this page need to be so long in the first place? Can it be split into multiple pages? Is the content auto-generated?

My feeling is that this page is auto-generated, and generating markdown might not be the best solution in the first place. I would rather:

  • generate the table data as a JSON file
  • create a TpsrTable React component that imports that JSON file and renders the table in plain React
  • use <TpsrTable> inside the tpsr.md file, leveraging the power of MDX
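A minimal sketch of that approach (all file names, paths, column names, and the data shape here are hypothetical):

```jsx
// src/components/TpsrTable.js (hypothetical path)
import React from 'react';
// Hypothetical JSON file emitted by the generator instead of a 4 MB markdown table:
import data from '@site/src/data/tpsr.json';

export default function TpsrTable() {
  return (
    <table>
      <thead>
        <tr>
          {/* Column names are illustrative; adapt them to the real data. */}
          <th>Code</th>
          <th>Description</th>
        </tr>
      </thead>
      <tbody>
        {data.map((row) => (
          <tr key={row.code}>
            <td>{row.code}</td>
            <td>{row.description}</td>
          </tr>
        ))}
      </tbody>
    </table>
  );
}
```

The markdown file would then import the component (`import TpsrTable from '@site/src/components/TpsrTable';`) and render `<TpsrTable />` where the giant table used to be, so MDX only has to compile a few lines instead of 23k.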

@adventure-yunfei (Contributor)

If the content is pure markdown, we'll enable later to provide an alternate md parser (#3018), which may be more memory efficient (but also more limited due to the inability to use React inside markdown).

Will this alternative md parser be the key to making Docusaurus v2 viable for large doc sites? Or is there any other workaround?

I've run into a similar problem when switching from Docusaurus v1 to v2. I have a very, very large doc site (containing 7000+ md files, auto-generated from *.d.ts files as API docs). Docusaurus v1 generates the site in about one or two minutes, but Docusaurus v2 is unable to: the process breaks with a strange error after one hour.

Docusaurus v2 has more amazing features than v1 (in my opinion, v1 is merely usable and missing many configuration options), so I'm really looking forward to switching to v2. With some effort, I've resolved the md syntax parsing errors, but I'm stuck at compile time.

@slorber (Collaborator) commented Jun 14, 2021

@adventure-yunfei here we're talking about one very large file, not a lot of small files.

Docusaurus 2 compiles each md file to a React component by default (with MDX), which adds some overhead if you don't need MDX. Allowing a less powerful parser can be useful, and would also make Docusaurus easier to adopt for sites that use CommonMark and aren't willing to change their docs during migration.

I suggest opening another issue dedicated to your specific problem; it will be hard to troubleshoot without a repro that I can run. Note that we already have a few very large sites on Docusaurus, but yes, build time is definitely a pain point (e.g. https://xsoar.pan.dev/docs/reference/index).

@Josh-Cena (Collaborator)

Closing in favor of #4765. We should work hard on reducing memory usage and build time, as I've heard many complaints about them.

@Josh-Cena Josh-Cena added closed: duplicate This issue or pull request already exists in another issue or pull request and removed status: needs triage This issue has not been triaged by maintainers labels Feb 26, 2022

4 participants