[Bug Report]: A Large Config Array Uses All System Resources #2431

Closed
beeequeue opened this issue Mar 22, 2023 · 20 comments

beeequeue commented Mar 22, 2023

System Info

  System:
    OS: macOS 12.6.3
    CPU: (10) arm64 Apple M1 Max
    Memory: 15.62 GB / 64.00 GB
    Shell: 5.8.1 - /bin/zsh
  Binaries:
    Node: 16.19.1 - ~/.local/share/rtx/installs/nodejs/16.19.1/bin/node
    Yarn: 1.22.19 - ~/.local/share/rtx/installs/nodejs/16.19.1/bin/yarn
    npm: 8.19.3 - ~/.local/share/rtx/installs/nodejs/16.19.1/bin/npm
  Browsers:
    Chrome: 110.0.5481.177
    Firefox: 110.0.1
    Safari: 16.3
  npmPackages:
    @rspack/cli: latest => 0.1.2

Details

At some point when using an array of configurations (7 configs when I first encountered it, 10 in the repro repo), the build slows to a crawl and takes 10x as long.

Here is how the repro repo build looks on my M1 Mac 64GB:

Building 1 configs...
build: 288.429ms
Building 2 configs...
build: 297.402ms
Building 3 configs...
build: 395.663ms
Building 4 configs...
build: 317.321ms
Building 5 configs...
build: 339.679ms
Building 6 configs...
build: 414.946ms
Building 7 configs...
build: 586.755ms
Building 8 configs...
build: 564.843ms
Building 9 configs...
build: 831.296ms
Building 10 configs...
build: 9.221s          <-- Right here!
Building 11 configs...
build: 10.404s
Building 12 configs...
build: 7.152s

Reproduce link

https://github.com/BeeeQueue/repro-rspack-config-array

Reproduce Steps

  1. Install deps - pnpm install
  2. Run build - pnpm build
  3. See in the logs how, at some point, the build slows to a crawl
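
For readers unfamiliar with config arrays: the repro exports an array of near-identical configs whose length is driven by the AMOUNT environment variable. Below is a minimal sketch of that kind of config; the entry and output paths here are illustrative assumptions, not copied from the repro.

// rspack.config.js (illustrative sketch, not the repro's actual config)
const path = require("path");
const amount = Number(process.env.AMOUNT || 1);

// One config object per requested build; only the name and output path differ.
module.exports = Array.from({ length: amount }, (_, i) => ({
  name: `config-${i + 1}`,
  entry: "./src/index.js",
  output: {
    path: path.resolve(__dirname, "dist", String(i + 1)),
  },
}));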
@beeequeue beeequeue added the bug Something isn't working label Mar 22, 2023
@ScriptedAlchemy (Contributor)

I wonder if this is caused by running out of cores, or by sharing cores across ten builds. When only nine cores of your 10-core CPU are in use it stays fast; as soon as there isn't a free core, the build time jumps significantly.

@beeequeue (Author)

That would make sense. In the project where I initially encountered this, the entire computer freezes intermittently while CPU usage sits at 100%.

@ScriptedAlchemy (Contributor)

Not saying this is 100% the case, but in webpack I accidentally ran a parallel build task over module-federation-examples, which issues 179 webpack builds at once. My computer froze up and I wasn't even able to exit the terminal.

My blind guess is that there may need to be a "management" thread for Rust to communicate with the JS bindings over. If the CPU is choked, that might put resource limits on Rust-JS transfer, since JS, even with multiple processes, seems to have a few global limits.

Definitely worth looking into a little deeper, because I can see an NX monorepo trying to build in parallel, and we don't want to impact those common use cases.

@beeequeue beeequeue changed the title [Bug Report]: A Large Config Array Is Abnormally Slow [Bug Report]: A Large Config Array Uses All System Resources Mar 26, 2023
@hardfist hardfist added the P0 label Mar 27, 2023
Boshen (Contributor) commented Mar 30, 2023

@beeequeue I found a workaround and need your help verifying whether it works.

Inside node_modules/@rspack/core/dist/multiCompiler.js,
add options.parallelism = 1; below line 29:

class MultiCompiler {
    constructor(compilers, options) {
        options.parallelism = 1; // run the child compilers one at a time

And comment out line 444:

                if (result) {
                    node.result = undefined;
                    // commented out: pushing the huge stats array through napi can crash
                    // stats.push(result);
                }

There are two problems at hand:

  • Running the multi compiler in parallel will cause Rspack to go haywire on the Rust side
  • Sending a huge stats array through napi will cause it to crash

@beeequeue (Author)

From my testing, the only change needed is the parallelism option. I set it to Math.round(require("os").cpus().length / 2) and it still seems to work as intended.

Leaving the stats push in place still works for me.
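
For reference, a minimal sketch of that variant of the patch, applied at the same spot in node_modules/@rspack/core/dist/multiCompiler.js as Boshen's snippet above (line numbers may differ between versions; only the assignment changes):

class MultiCompiler {
    constructor(compilers, options) {
        // Cap concurrent child compilers at half the available cores instead of 1.
        options.parallelism = Math.round(require("os").cpus().length / 2);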

Boshen (Contributor) commented May 12, 2023

This should be fixed by the referenced issues and PRs; feel free to reopen if it's still happening.

@Boshen Boshen closed this as completed May 12, 2023
@beeequeue (Author)

From my quick testing it has not fixed it.

@Boshen Boshen reopened this May 16, 2023
@stale stale bot removed the stale label May 16, 2023
Boshen (Contributor) commented May 16, 2023

From my quick testing it has not fixed it.

Do you mind updating your repro so I can take a quick look?

@beeequeue (Author)

It had already been updated, but I updated it to the latest version again and there's still no difference.

Running it locally:

Details
repro-rspack-config-array main |  v18.16.0  took 23s
❯ nr build

> rspack-react-starter@0.0.0 build /Users/ahaglund/projects/repro-rspack-config-array
> for i in {1..12}; do AMOUNT="$i" rspack build; done

Building 1 configs...
Time: 135ms
Building 2 configs...
Time: 157ms

Time: 151ms
Building 3 configs...
Time: 143ms

Time: 134ms

Time: 138ms
Building 4 configs...
Time: 172ms

Time: 170ms

Time: 169ms

Time: 160ms
Building 5 configs...
Time: 171ms

Time: 161ms

Time: 157ms

Time: 164ms

Time: 163ms
Building 6 configs...
Time: 204ms

Time: 209ms

Time: 219ms

Time: 190ms

Time: 201ms

Time: 213ms
Building 7 configs...
Time: 273ms

Time: 279ms

Time: 231ms

Time: 224ms

Time: 331ms

Time: 224ms

Time: 280ms
Building 8 configs...
Time: 374ms

Time: 305ms

Time: 397ms

Time: 353ms

Time: 335ms

Time: 385ms

Time: 364ms

Time: 347ms
Building 9 configs...
Time: 822ms

Time: 768ms

Time: 842ms

Time: 797ms

Time: 760ms

Time: 799ms

Time: 783ms

Time: 788ms

Time: 794ms
Building 10 configs...
Time: 6569ms

Time: 6812ms

Time: 6812ms

Time: 6837ms

Time: 6788ms

Time: 6763ms

Time: 6763ms

Time: 6834ms

Time: 6754ms

Time: 6843ms
Building 11 configs...
Time: 4480ms

Time: 5780ms

Time: 5686ms

Time: 5812ms

Time: 5784ms

Time: 5730ms

Time: 5555ms

Time: 5787ms

Time: 5647ms

Time: 5728ms

Time: 5731ms
Building 12 configs...
Time: 12812ms

Time: 14080ms

Time: 14280ms

Time: 14315ms

Time: 14165ms

Time: 14290ms

Time: 14225ms

Time: 14350ms

Time: 14283ms

Time: 14200ms

Time: 14341ms

Time: 14345ms

Boshen (Contributor) commented May 16, 2023

Relates to #3169

beeequeue added a commit to beeequeue/repro-rspack-config-array that referenced this issue May 16, 2023
@beeequeue (Author)

The workaround/fix from before still works as well: beeequeue/repro-rspack-config-array#1

stale bot commented Jul 17, 2023

This issue has been automatically marked as stale because it has not had recent activity. If this issue is still affecting you, please leave any comment (for example, "bump"). We are sorry that we haven't been able to prioritize it yet. If you have any new additional information, please include it with your comment!

@stale stale bot added the stale label Jul 17, 2023
@beeequeue (Author)

Bump

@stale stale bot removed the stale label Jul 17, 2023
stale bot commented Sep 15, 2023

This issue has been automatically marked as stale because it has not had recent activity. If this issue is still affecting you, please leave any comment (for example, "bump"). We are sorry that we haven't been able to prioritize it yet. If you have any new additional information, please include it with your comment!

@stale stale bot added the stale label Sep 15, 2023
@hardfist hardfist removed this from the Working in progress milestone Oct 4, 2023
@stale stale bot removed the stale label Oct 4, 2023
stale bot commented Dec 15, 2023

This issue has been automatically marked as stale because it has not had recent activity. If this issue is still affecting you, please leave any comment (for example, "bump"). We are sorry that we haven't been able to prioritize it yet. If you have any new additional information, please include it with your comment!

@stale stale bot added the stale label Dec 15, 2023
@Boshen Boshen removed their assignment Feb 5, 2024
@stale stale bot removed the stale label Feb 5, 2024
stale bot commented Apr 5, 2024

This issue has been automatically marked as stale because it has not had recent activity. If this issue is still affecting you, please leave any comment (for example, "bump"). We are sorry that we haven't been able to prioritize it yet. If you have any new additional information, please include it with your comment!

@stale stale bot added the stale label Apr 5, 2024
h-a-n-a (Contributor) commented Apr 25, 2024

This is related to the parallelism of the tasks involved in module graph creation. webpack's limit is 100 by default; we should support this as well.
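
For context, webpack exposes this as a top-level parallelism option in the configuration, which limits how many modules are processed in parallel and defaults to 100. A sketch of what the equivalent would look like, assuming Rspack adopts the same option name:

// rspack.config.js (sketch; option name assumed to mirror webpack's)
module.exports = {
    entry: "./src/index.js",
    // Limit the number of modules processed in parallel (webpack's default is 100).
    parallelism: 100,
};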

@stale stale bot removed the stale label Apr 25, 2024
@t0dorakis

Hi there! Is there a known workaround for this, i.e. some kind of limit?
Our build peaks at more than 30 GB of memory usage (8 GB in webpack).
We are using an array of 11 configs 🗡️

@beeequeue (Author)

In my repro repo the issue seems to have been resolved.
Build times now increase in a predictable manner as more builds run at once.

hardfist (Contributor) commented Jul 4, 2024

Seems we can close this issue now.

@hardfist hardfist closed this as completed Jul 4, 2024