Process Hanging on large project #1284
Comments
Thanks a lot @Stemirabo for opening this issue. We don't have access to projects of this scale, so this is greatly appreciated. For starters, could you try to reduce the number of concurrent test runners? For example:
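A minimal sketch of what that might look like in stryker.conf.js, assuming the v0.x-era maxConcurrentTestRunners option:

```js
// stryker.conf.js — minimal sketch (v0.x-style config function)
module.exports = function (config) {
  config.set({
    // The default is roughly (number of CPUs - 1); lowering it reduces
    // peak memory usage at the cost of a longer total run time.
    maxConcurrentTestRunners: 2
  });
};
```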
Thanks for the response. Now that I'm back from holiday, I've updated to the latest versions of Stryker...
I also tried lowering the number of concurrent test runners and let it run over the weekend. This was with a smaller number of files to be mutated, and it eventually made it to the ETC / progress phase. Unfortunately I didn't have logging turned on for that run (which I'm trying to recreate now), but here is the dump it spit out when it died...
Not sure how helpful this is, but I'll try to update when I recreate it with file logging on.
I've discussed this case with @simondel. We think the problem is in the way we transpile and test the mutants: transpiling each mutant is generally a lot faster than testing it, so the amount of pending work keeps piling up. Each transpile result contains the content of one or more JS output files, so this essentially behaves like a memory leak. @Stemirabo, could you please help us verify this? I've altered the file "MutationTestExecutor.js" with memory logging; it also logs the number of transpiled mutants (both on level DEBUG). Would you please replace that file with the altered version and run Stryker again?
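Not the actual patch, but a sketch of the kind of logging described above, using Node's built-in process.memoryUsage(); the real file logs through Stryker's own logger, so console.debug stands in here:

```js
// Sketch only: count transpiled mutants and log memory use at DEBUG level.
let transpiledMutants = 0;

function logTranspileProgress() {
  transpiledMutants++;
  const memoryMb = process.memoryUsage().rss / 1024 / 1024; // resident set size in MB
  console.debug(`Transpiled mutants: ${transpiledMutants}. Memory: ${memoryMb.toFixed(2)} MB`);
}

// Call this each time a transpiled mutant is queued for testing.
logTranspileProgress();
```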
EDIT: Sorry, I missed your update before posting this one... I'll give that a shot and see what happens. Here are the trace log file contents for the previous error dump:
OK, I ran it again with your modified file as requested. Same output, except it spit out "DEBUG MutationTestExecutor Transpiled mutants: 1. Memory: 2376.39 MB" after the first mutant test run, and then the process died / ran out of memory. Here's the full logfile:
Here's the console dump:
This may be fixed in the latest release. Could you try to upgrade to the latest v0.x release of Stryker?
Hi. Sorry for the slow reply. I've updated to the latest packages...
...and reran the same build with "--fileLogLevel trace --logLevel debug". No noticeable difference on my end. Maybe something in the logs will help you guys, though. Logfile:
Console dump:
I had similar troubles with the popular Prettier code formatting tool; see prettier/prettier#6681 (I limited it to CLI testing for now, as an unfortunate workaround). This was on a Vultr 16-CPU virtual server with 64 GB RAM, not good!
This seems to help run Stryker on all source files in Prettier:
node --max-old-space-size=40000 ./node_modules/.bin/stryker run
ref: https://stackoverflow.com/questions/38558989/node-js-heap-out-of-memory
It may be possible to use NODE_OPTIONS instead, as documented in the second answer. For further investigation when I get a chance.
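A quick way to verify that the larger heap limit was actually picked up (a small sketch, not from the thread, using Node's built-in v8 module):

```js
// check-heap-limit.js — prints the effective V8 heap limit, so you can confirm
// that --max-old-space-size (or NODE_OPTIONS) was applied to this process.
const v8 = require('v8');

const limitMb = v8.getHeapStatistics().heap_size_limit / 1024 / 1024;
console.log(`V8 heap size limit: ${Math.round(limitMb)} MB`);
```

Running it once with and once without the flag should show the limit change.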
@brodybits It makes sense, though; since we store quite a lot of data, it may weigh a lot. But 1.7 GB, I dunno. There should be some tools to check how much data is used. I am really looking forward to your results!
Yes, but not such a good experience. I am discussing my progress in hopes that some others will start watching, experimenting, and contributing ideas to help us all find a good solution. I have a feeling (and nothing more than a feeling) that Stryker could use some improvement in resource allocation and usage. I think it should be possible to break things into separate worker threads or processes that can work together through promises, as sketched below. I also have a feeling that Node.js could be a little smarter about using the system resources available. I will continue to experiment and explore this behavior as much as I can.
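To illustrate the general pattern I mean (purely a sketch with Node's worker_threads module wrapped in a promise; this is not how Stryker is structured internally, and the payload names are made up):

```js
// worker-sketch.js — run a unit of work in a worker thread and get the result
// back through a promise.
const { Worker, isMainThread, parentPort, workerData } = require('worker_threads');

if (isMainThread) {
  // Main thread: spawn a worker for one payload and resolve with its result.
  const runInWorker = (payload) =>
    new Promise((resolve, reject) => {
      const worker = new Worker(__filename, { workerData: payload });
      worker.once('message', resolve);
      worker.once('error', reject);
      worker.once('exit', (code) => {
        if (code !== 0) reject(new Error(`Worker exited with code ${code}`));
      });
    });

  runInWorker({ mutantId: 1 }).then((result) => console.log('worker finished:', result));
} else {
  // Worker thread: do the heavy lifting off the main thread, then report back.
  parentPort.postMessage({ mutantId: workerData.mutantId, status: 'tested' });
}
```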
@brodybits I like your thinking. If you need some resources, I have found:
Nice resources, thanks! I am still quite new with Stryker, not sure when I will get to do much more investigation.
I guess the easiest way to get into Stryker is to play with it just a bit. After doing several commits, and understanding...
as discussed in: stryker-mutator/stryker-js#1284 (comment)
ref: https://stackoverflow.com/questions/38558989/node-js-heap-out-of-memory
This kind of bug workaround was needed in some cases, for example when running Stryker on the following artifact: src/language-js/printer-estree.js
Hi! I'm closing this issue for now. If it still persists with Stryker version 4, please open a new issue. Thanks!
Summary
Hi. I'm working on a fairly large TypeScript project (~1000 files, about half of them test files) with ~5000 tests, and I'm trying to get Stryker running. When I add too many files to be mutated, the number of mutants gets to a point where the process seems to just sit there grinding away for hours. The last output I see on screen is "40055 Mutant(s) generated". The Node process doesn't seem to be stuck, as it's using CPU cycles and its memory usage changes; I just never get to the next step where it fires up the TestRunners and starts the real work. Do I just have to wait it out?
Even if I lower the number of files to be mutated so that ~20000 mutants are generated, it still hangs... the largest number of mutants I've seen run successfully is around 13000. The project is private so I can't share much, but hopefully there's a solution. I fully expect this to take a long time; it will be run on a CI instance over the weekend once I can get it working.
Any help would be appreciated.
Stryker config
I've tried with and without the transpiler and I don't see a difference, aside from being able to do coverageAnalysis, which has other issues for me.
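For reference, the two knobs mentioned above sit in the config roughly like this (a sketch, assuming the v0.x stryker-typescript transpiler and the coverageAnalysis option; not my actual config):

```js
// stryker.conf.js — sketch of the options referred to above (v0.x era).
module.exports = function (config) {
  config.set({
    transpilers: ['typescript'],  // requires the stryker-typescript plugin
    coverageAnalysis: 'off'       // 'all' or 'perTest' need a compatible test-runner setup
  });
};
```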
Stryker environment
System Environment
stryker.log