
[Tests] fix attention masks in Tests #6621

Merged

Conversation

@patrickvonplaten (Contributor) commented Aug 20, 2020

This PR should fix the flaky failures of test_modeling_output_equivalence and test_feed_forward_chunking.

I added a new random attention_mask generation function that guarantees at least one token is attended to in every sequence.

@patrickvonplaten patrickvonplaten linked an issue Aug 20, 2020 that may be closed by this pull request
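For context, the idea behind the fix can be sketched as follows. This is an illustrative reconstruction, not necessarily the exact helper added in the PR: a fully random 0/1 mask can produce a row of all zeros, which masks out every position and makes attention outputs degenerate (NaNs or test-to-test nondeterminism). Forcing one position to 1 rules that out:

```python
import torch

def random_attention_mask(shape):
    """Illustrative sketch: random 0/1 attention mask that guarantees
    at least one attended token per sequence.

    Without the fix, a row could be all zeros, i.e. no token attended
    to, which is what made the tests flaky.
    """
    mask = torch.randint(0, 2, shape)
    # Force the last position to be attended to, so no row is all zeros.
    mask[:, -1] = 1
    return mask
```

The choice of position is arbitrary; any fixed index works, since the point is only to ensure the mask is never empty.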
codecov bot commented Aug 20, 2020

Codecov Report

Merging #6621 into master will increase coverage by 0.85%.
The diff coverage is n/a.


@@            Coverage Diff             @@
##           master    #6621      +/-   ##
==========================================
+ Coverage   79.16%   80.02%   +0.85%     
==========================================
  Files         156      156              
  Lines       28217    28217              
==========================================
+ Hits        22339    22581     +242     
+ Misses       5878     5636     -242     
Impacted Files Coverage Δ
src/transformers/modeling_tf_xlm.py 18.94% <0.00%> (-74.32%) ⬇️
src/transformers/modeling_tf_flaubert.py 24.53% <0.00%> (-63.81%) ⬇️
src/transformers/modeling_roberta.py 77.37% <0.00%> (-19.71%) ⬇️
src/transformers/file_utils.py 82.18% <0.00%> (-0.26%) ⬇️
src/transformers/modeling_utils.py 88.05% <0.00%> (+0.55%) ⬆️
src/transformers/modeling_tf_utils.py 87.29% <0.00%> (+2.60%) ⬆️
src/transformers/generation_tf_utils.py 86.46% <0.00%> (+2.75%) ⬆️
src/transformers/configuration_t5.py 96.42% <0.00%> (+10.71%) ⬆️
src/transformers/modeling_t5.py 83.83% <0.00%> (+12.21%) ⬆️
src/transformers/modeling_tf_t5.py 90.93% <0.00%> (+64.09%) ⬆️
... and 1 more

Legend: Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Last update 573bdb0...d61cbf8.

@sgugger (Collaborator) left a comment:

Thanks for digging into this and fixing those flaky tests!

@sgugger sgugger merged commit 505f2d7 into huggingface:master Aug 20, 2020
@patrickvonplaten patrickvonplaten deleted the fix_distilbert_flaky_test branch August 20, 2020 17:24
Zigur pushed a commit to Zigur/transformers that referenced this pull request Oct 26, 2020
fabiocapsouza pushed a commit to fabiocapsouza/transformers that referenced this pull request Nov 15, 2020
fabiocapsouza added a commit to fabiocapsouza/transformers that referenced this pull request Nov 15, 2020
Successfully merging this pull request may close these issues.

[DistilBert] Flaky tests