
Fix hypothesis health checks #680

Merged: 6 commits, Mar 12, 2020

Conversation

gonzaponte (Collaborator)

Every now and then the Travis build fails because data generation is too slow. This is a combination of two things:

  • the data_frames strategy is out of our hands
  • the large number of columns increases execution time significantly and can't/shouldn't be reduced

This PR suppresses the HealthCheck for those tests using this strategy.
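
For reference, a minimal sketch of what suppressing this health check looks like (the test body and column names here are placeholders, not IC's actual tests):

```python
from hypothesis import given, settings, HealthCheck
from hypothesis.extra.pandas import column, data_frames

# Suppress only the too_slow health check for tests whose data_frames
# generation is inherently slow; all other health checks stay active.
@settings(suppress_health_check=[HealthCheck.too_slow])
@given(data_frames(columns=[column(name, dtype=float)
                            for name in ("x", "y", "z", "energy")]))
def test_dataframe_processing(df):
    assert list(df.columns) == ["x", "y", "z", "energy"]
```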

Aretno (Collaborator) commented Nov 27, 2019

Actually, since this happens in the monitor_functions tests and those now live outside IC, we could just delete the files and forget about their existence.

gonzaponte (Collaborator, Author)

What do you mean they live outside IC?

Aretno (Collaborator) commented Nov 27, 2019

Ah, this: https://github.com/nextic/Olivia

So the kdst functions are no longer used, while the pmaps and rwf ones now live as clients instead of inside IC.

gonzaponte (Collaborator, Author) commented Nov 27, 2019

Ah, if these functions are zombies, kill 'em all! (but keep the tests in the other repo! :))

mmkekic (Collaborator) commented Dec 16, 2019

Shall we then remove the files altogether? @Aretno, @gonzaponte, what's the status of this?

gonzaponte force-pushed the fix_hypothesis_health_checks branch from a9aa6e4 to f48bf1a on March 8, 2020
gonzaponte (Collaborator, Author)

@jacg, can I ask for your advice here?

Hypothesis is throwing a ton of these messages when running the Travis build (this doesn't happen on my machine):

HypothesisDeprecationWarning: Test took XXX ms to run. In future the default deadline
setting will be 200ms, which will make this an error. You can set deadline to an
explicit value of e.g. 300 to turn tests slower than this into an error, or you can
set it to None to disable this check entirely.

so I attempted to raise the deadline for the specific tests that were throwing the warning. However, after doing so, more and more tests kept popping up with this warning. This makes me think that there are huge fluctuations in the time Travis takes to run a given test, and therefore it doesn't make sense to set this deadline on a test-by-test basis.

Do you recommend disabling the deadline when running in Travis, maybe setting a deadline on the global test-suite execution time? Do you have any other suggestions on how to address this?
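
For reference, the two per-test options the warning describes look like this (a sketch with illustrative test names, not code from this PR):

```python
from hypothesis import given, settings, strategies as st

# Option 1: raise the deadline for a specific slow test (milliseconds).
@settings(deadline=300)
@given(st.integers())
def test_with_raised_deadline(n):
    assert isinstance(n, int)

# Option 2: disable the deadline check entirely for this test.
@settings(deadline=None)
@given(st.integers())
def test_with_no_deadline(n):
    assert isinstance(n, int)
```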

jacg (Collaborator) commented Mar 9, 2020

Hmm. I don't really have any words of wisdom on this. It would be helpful to understand what is going on with Travis. I would try disabling the deadlines on Travis, and see how long the tests are taking. If the tests exceed the deadlines relatively modestly, then it might just be best to (take on some technical debt, and) keep them switched off for now, thus putting off the need to understand what is causing this. If the tests turn out to be unacceptably slow, you'll probably have to dig deeper to find out why this is happening. If it gets to that, it might be worthwhile trying it locally in a Travis Docker image.

gonzaponte force-pushed the fix_hypothesis_health_checks branch from 4720a24 to 04f0821 on March 10, 2020
gonzaponte (Collaborator, Author)

In the end, I realized that part of the problem was that the Travis config hardcoded the number of processors to use (thanks to @mmkekic for her input), and this value was higher than the number of processors actually available.

Nonetheless, some tests were still hitting the hypothesis deadline, which will become an error in a future release and therefore must be fixed. Execution times vary from machine to machine, and Travis seems to be particularly slow, so I have created a new hypothesis profile specifically for Travis with a slightly higher deadline (sketched below). I have checked that those warnings no longer appear.
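
Registering and selecting a CI-specific profile follows this pattern (a minimal sketch after the Hypothesis docs; the profile names, deadline values, and TRAVIS environment check are assumptions, not necessarily what this PR does verbatim):

```python
# Typically placed in conftest.py so it runs before tests are collected.
import os
from hypothesis import settings

# A more generous deadline for slow CI machines (values are illustrative).
settings.register_profile("travis", deadline=1000)
settings.register_profile("dev", deadline=200)

# Travis sets the TRAVIS environment variable; pick the profile accordingly.
settings.load_profile("travis" if os.environ.get("TRAVIS") else "dev")
```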

We still get a warning about a global timeout for hypothesis, but this feature will disappear in a future release and therefore is not a concern.

As far as I am concerned, this PR is ready for review.

gonzaponte force-pushed the fix_hypothesis_health_checks branch from 04f0821 to 3fb5a6a on March 11, 2020
mmkekic (Collaborator) left a review comment

This PR registers a new hypothesis profile for travis-ci with an extended deadline and an automatic number of processes. Hopefully this will resolve the occasional Travis failures due to hypothesis health checks. Also, once (if ever) we set up our own CI server (#661, miguelsimon#3), this will no longer be an issue.

carmenromo merged commit aa1e75d into next-exp:master on Mar 12, 2020
gonzaponte deleted the fix_hypothesis_health_checks branch on Mar 12, 2020
andLaing pushed a commit to andLaing/IC that referenced this pull request May 13, 2020