Mark unit test as flaky #770
Conversation
Codecov Report
@@            Coverage Diff             @@
##           develop     #770      +/-   ##
===========================================
+ Coverage    87.57%   87.69%   +0.12%
===========================================
  Files           36       36
  Lines         4072     4072
===========================================
+ Hits          3566     3571       +5
+ Misses         506      501       -5
Continue to review full report at Codecov.
I'm sorry, I am confused about this... The test was already marked as flaky? What does it mean to use the default value for ...?
One of them was; the other one is newly marked. I think nothing happened, though, because the flaky plugin was not yet installed.
There exists no ... All in all, I think this PR is an improvement over the previous state. If the unit tests continue to fail, we can increase the number of retries even further.
To make sure I understand it right, what you are saying is that the previous flaky marker did not do anything?
From my understanding of the pytest docs, pytest by itself does not provide functionality to rerun flaky tests. Therefore, I concluded that the marker did not do anything and that we need the flaky plugin.
I'll trust your judgement on that. Seems weird but also probably correct.
This PR marks as flaky one unit test that often fails under high test server load (i.e. multiple Travis runs at the same time). It also installs the flaky package, a pytest plugin that actually reruns the marked unit test.
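For reference, a minimal sketch of how a test can be marked with the flaky plugin so pytest reruns it on failure. The test name, retry counts, and assertion below are illustrative assumptions, not the actual test changed by this PR:

```python
# Requires the flaky package to be installed (pip install flaky).
from flaky import flaky


@flaky(max_runs=3, min_passes=1)  # rerun up to 3 times; a single pass is enough
def test_request_against_busy_test_server():
    # Placeholder for the real check against the (possibly overloaded) test server.
    response_ok = True
    assert response_ok
```

Without the flaky plugin installed, pytest itself does not rerun failing tests, which matches the conclusion reached above.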