Intermittent issue with Travis build #1742

Closed
alansouzati opened this issue Sep 20, 2016 · 10 comments

Comments

@alansouzati
Contributor

Sometimes my Travis build fails with the following log:

  ● Test suite failed to run
    ProcessTerminatedError: cancel after 2 retries!

      at Farm.<anonymous> (node_modules/worker-farm/lib/farm.js:81:25)
      at Array.forEach (native)
      at Farm.<anonymous> (node_modules/worker-farm/lib/farm.js:75:36)
      at tryOnTimeout (timers.js:224:11)
      at Timer.listOnTimeout (timers.js:198:5)

Do you want to request a feature or report a bug?

report a bug

What is the current behavior?

https://travis-ci.org/grommet/grommet/builds/161188855#L579

@cpojer
Member

cpojer commented Sep 20, 2016

Can you try using -i on Travis, which will run all the tests in one thread instead? I'm thinking this might be a resource problem, like one or more tests leaking memory or something similar.
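(For context, -i is the short form of Jest's --runInBand flag: it runs all test files serially in the jest process instead of farming them out to a pool of worker processes. A minimal sketch of passing it through on Travis, assuming the project's npm test script invokes jest:)

  # .travis.yml (sketch; assumes "npm test" runs jest)
  script:
    - npm test -- -i   # forwards -i (--runInBand) to jest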

@alansouzati
Contributor Author

Thanks for the follow-up. I will try this option. One question: if this is a memory leak in my code, should I be able to reproduce it locally? I tried running the tests 20 times in a row with multiple threads and could not reproduce the issue. I also haven't heard from any of our community members hitting this intermittent issue on their local machines.

Could this be related to a timeout or something similar? One noticeable difference between my local machine and Travis is how long a test takes to run in CI. I'm wondering whether that could be related to this issue.

@cpojer
Member

cpojer commented Sep 20, 2016

Travis in general doesn't have a lot of resources, so you might not be able to notice this locally because your CPU and memory are much better than what a Travis VM gives you.

@cpojer
Member

cpojer commented Sep 20, 2016

I'm going to close this because there is probably nothing actionable for us here, and -i has worked in the past. If it turns out to be an issue in Jest and you have a repro, we are happy to help you fix it.

@cpojer cpojer closed this as completed Sep 20, 2016
@aaronabramov
Contributor

@alansouzati that happened to me a few times, and using -i helped.
The reason it was happening was conflicting subprocesses that I spawned from my Jest tests. After switching to -i there was only one subprocess running at a time, which fixed the issue. It was not reproducible locally either.
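(To illustrate the kind of conflict described above, here is a hypothetical Jest test, not taken from any project in this thread, whose spawned helper binds a fixed port. With the default worker pool, two test files doing this at the same time race for the port; with -i only one test file runs at a time, so the helpers never overlap:)

  // port-conflict.test.js (illustrative sketch only)
  const { spawn } = require('child_process');

  test('spawns a helper that binds a fixed port', done => {
    // The helper exits 0 if it could listen on port 3000, and non-zero
    // (unhandled EADDRINUSE) if another worker's helper already holds it.
    const helper = spawn(process.execPath, [
      '-e',
      "require('net').createServer().listen(3000, () => process.exit(0))",
    ]);
    helper.on('exit', code => {
      done(code === 0 ? undefined : new Error(`helper exited with code ${code}`));
    });
  });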

@alansouzati
Contributor Author

Thanks for the follow-up, guys.


@terkelg

terkelg commented Nov 14, 2016

Thanks @DmitriiAbramov and @cpojer. My Travis build went from being terminated after 16 minutes to finishing in only 1 min 27 sec after adding -i:

script:
  - npm test -- -i

@cpojer
Member

cpojer commented Nov 14, 2016

It really depends on what you are doing in your tests and how you are utilizing your resources. If you spawn a lot of child processes in your tests, running Jest with many worker processes of its own might stall the CPU. In this case, -i can improve performance. Also consider options like -w 2, which sets the number of workers to 2; you could try half the workers available on your system and see if that improves things, for example.
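(A sketch of that middle ground on Travis; --maxWorkers is the long form of -w, and again this assumes the project's npm test script invokes jest:)

  script:
    - npm test -- --maxWorkers=2   # same as -w 2: cap Jest at two worker processes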

yesmeck added a commit to ant-design/ant-design that referenced this issue Nov 21, 2016
yesmeck added a commit to ant-design/ant-design that referenced this issue Nov 22, 2016
benjycui pushed a commit to ant-design/ant-design that referenced this issue Nov 22, 2016
* Run snapshot testing against all demos

* Split demo tests

* ignore coverage folder

* Upgrade antd-demo-jest

* enable cache

* intergate with coveralls.io

* Add node test

* Set worker to 2

jestjs/jest#1742

* config coverage

* Set default supportServerRender to true
@davidtheclark

davidtheclark commented Nov 27, 2016

I've run into this problem in our attempt to migrate stylelint to Jest. I don't understand how it is not an issue that Jest itself needs to address. As an aspiring Jest user, here's what I'm seeing:

  • The default Jest settings don't reliably work in CI builds, making CI that uses Jest unreliable — which is a significant problem.
  • The only documentation suggesting you should be using -i on CI is buried in a closed issue.
  • Even when I do use -i, I'm running into memory allocation failures on Travis that prevent the tests from working as they do locally.

UnwashedMeme added a commit to iDigBio/idigbio-search-api that referenced this issue Jan 13, 2017
We're seeing intermittent failures there; echoed in jestjs/jest#1742

The recommendation there is to add the `-i` flag, which makes it run all tests inline instead of using a process pool.
kmjennison added a commit to gladly-team/tab that referenced this issue Feb 23, 2018
TravisCI erroring with “ProcessTerminatedError: cancel after 2 retries!”
jestjs/jest#1742
@github-actions

This issue has been automatically locked since there has not been any recent activity after it was closed. Please open a new issue for related bugs.
Please note this issue tracker is not a help forum. We recommend using Stack Overflow or our Discord channel for questions.

@github-actions github-actions bot locked as resolved and limited conversation to collaborators May 14, 2021