[Tests common] Fix flaky test #6593
Conversation
OK for the temp fix. It might warrant some deeper investigation though.
Codecov Report
@@ Coverage Diff @@
## master #6593 +/- ##
==========================================
+ Coverage 79.42% 79.43% +0.01%
==========================================
Files 156 156
Lines 28127 28127
==========================================
+ Hits 22339 22343 +4
+ Misses 5788 5784 -4
Long term, I would rather fill NaN with a sentinel value like -11.27 (my negative birthday, don't forget!) and check equality, but I am also fine with the temp fix.
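The sentinel idea above can be sketched as follows. This is an illustrative snippet, not code from the PR: the `fill_nan` helper name and the `-11.27` default are the reviewer's suggestion taken literally, shown here on plain Python lists rather than tensors.

```python
import math

def fill_nan(values, sentinel=-11.27):
    # Replace NaN entries with a fixed sentinel so that ordinary
    # equality checks work: by IEEE 754 semantics, NaN != NaN, so
    # two otherwise-identical outputs containing NaN never compare equal.
    # Helper name and sentinel value are illustrative, not from the PR.
    return [sentinel if math.isnan(v) else v for v in values]

a = [1.0, float("nan"), 3.0]
b = [1.0, float("nan"), 3.0]

assert a != b                      # naive equality fails on NaN
assert fill_nan(a) == fill_nan(b)  # sentinel-filled lists compare equal
```

The trade-off is that a sentinel collides with any legitimate output equal to it, which is why picking an unlikely value (and documenting it) matters.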
OK, pinging @LysandreJik here that we should take a look next week. Will note it down as well.
This reverts commit 38846a6.

The test test_model_outputs_equivalence currently fails quite often because of a problem with nan - nan. This should solve the issue. Also added a more explicit error message for when the test fails.

A flaky failure happens here, for example: https://app.circleci.com/pipelines/github/huggingface/transformers/10798/workflows/44e689b2-f4b3-49be-88b3-a5b214eac6c5/jobs/75173
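To see why nan - nan makes the equivalence test flaky: any difference-based tolerance check silently fails when both outputs contain NaN at the same position, because the difference is itself NaN and every comparison involving NaN evaluates to False. The sketch below illustrates the failure mode and one possible mitigation (zeroing out NaNs before comparing); the `set_nan_to_zero` helper is an assumption for illustration, not the exact code from the PR.

```python
import math

nan = float("nan")

# nan - nan is nan, and nan fails every ordering comparison,
# so a check like abs(diff) < tol can never pass.
diff = nan - nan
assert math.isnan(diff)
assert not (abs(diff) < 1e-5)

def set_nan_to_zero(values):
    # Zero out NaN entries in both outputs before differencing, so
    # matching NaN positions no longer poison the tolerance check.
    # Illustrative helper on plain lists, not the PR's tensor code.
    return [0.0 if math.isnan(v) else v for v in values]

a = [1.0, nan, 3.0]
b = [1.0, nan, 3.0]
max_diff = max(abs(x - y) for x, y in zip(set_nan_to_zero(a),
                                          set_nan_to_zero(b)))
assert max_diff < 1e-5  # outputs now compare as equivalent
```

Note this mitigation treats a NaN in one output and a zero in the other as equal, so a stricter check would also assert that the NaN masks of both outputs match.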