
Fix integration slow tests #10670

Merged
merged 2 commits into master from fix_integration_slow_tests on Mar 11, 2021

Conversation

@sgugger (Collaborator) commented on Mar 11, 2021

What does this PR do?

This PR fixes the following slow tests, which are failing because of the change of behavior in the Embedding layer in PyTorch 1.8. The fix adds an attention mask to ignore the padding token and checks a slice of the output that does not contain the padding hidden states (a rough sketch of this pattern follows the list below).

tests/test_modeling_albert.py::AlbertModelIntegrationTest::test_inference_no_head_absolute_embedding
tests/test_modeling_bert.py::BertModelIntegrationTest::test_inference_no_head_absolute_embedding
tests/test_modeling_bert.py::BertModelIntegrationTest::test_inference_no_head_relative_embedding_key
tests/test_modeling_bert.py::BertModelIntegrationTest::test_inference_no_head_relative_embedding_key_query
tests/test_modeling_convbert.py::ConvBertModelIntegrationTest::test_inference_masked_lm
tests/test_modeling_deberta.py::DebertaModelIntegrationTest::test_inference_no_head
tests/test_modeling_deberta_v2.py::DebertaV2ModelIntegrationTest::test_inference_no_head
tests/test_modeling_distilbert.py::DistilBertModelIntergrationTest::test_inference_no_head_absolute_embedding
tests/test_modeling_electra.py::ElectraModelIntegrationTest::test_inference_no_head_absolute_embedding
tests/test_modeling_squeezebert.py::SqueezeBertModelIntegrationTest::test_inference_classification_head
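
For reference, here is a minimal sketch of the pattern these updated tests use: the attention mask zeroes out the padding token so its changed embedding cannot influence the attended hidden states, and only a slice away from the padding position is compared. The checkpoint name, input ids, and expected values below are illustrative placeholders, not the tensors used in the actual tests.

```python
import torch
from transformers import BertModel

# Illustrative sketch only: checkpoint, inputs, and expected values are
# placeholders, not the ones used in the real integration tests.
model = BertModel.from_pretrained("bert-base-uncased")
model.eval()

input_ids = torch.tensor([[0, 345, 232, 328, 740, 140, 1695, 69, 6078, 1588, 2]])
# Mask out the padding token (id 0 at position 0) so the PyTorch 1.8 change
# to padded Embedding outputs does not affect the values being checked.
attention_mask = torch.tensor([[0, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1]])

with torch.no_grad():
    output = model(input_ids, attention_mask=attention_mask)[0]

# Compare only a slice that excludes the padding position; the expected
# values would be recorded from a reference run of the model.
expected_slice = torch.tensor(
    [[[0.4249, 0.1008, 0.7531], [0.3771, 0.1188, 0.7467], [0.4152, 0.1098, 0.7108]]]
)
assert torch.allclose(output[:, 1:4, 1:4], expected_slice, atol=1e-4)
```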

It also fixes

tests/test_modeling_mbart.py::MBartEnroIntegrationTest::test_enro_generate_batch

that was failing since the change in prepare_seq2seq_batch. For some reason one word of the generated translation is different, but the output is consistent between PyTorch 1.7 and PyTorch 1.8, so I updated the expected target. @patil-suraj if you want to take a closer look, I'll leave it to you.
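
For context, a rough sketch of what that test exercises, assuming the usual MBart en-ro setup; the checkpoint, source sentence, and language codes below are illustrative, not the exact values from the test.

```python
from transformers import MBartForConditionalGeneration, MBartTokenizer

# Sketch under assumptions: checkpoint and example sentence are illustrative.
tokenizer = MBartTokenizer.from_pretrained("facebook/mbart-large-en-ro")
model = MBartForConditionalGeneration.from_pretrained("facebook/mbart-large-en-ro")

src_text = ["UN Chief Says There Is No Military Solution in Syria"]
# prepare_seq2seq_batch sets the MBart language codes and tokenizes the batch;
# the recent change to it is what shifted one word of the generated output.
batch = tokenizer.prepare_seq2seq_batch(
    src_texts=src_text, src_lang="en_XX", tgt_lang="ro_RO", return_tensors="pt"
)
generated = model.generate(**batch)
decoded = tokenizer.batch_decode(generated, skip_special_tokens=True)
# The test then compares `decoded` against the (now updated) expected
# Romanian translation.
```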

@sgugger sgugger requested a review from LysandreJik March 11, 2021 18:33
@LysandreJik (Member) left a comment


This looks good to me, thanks for taking care of it!

@sgugger sgugger merged commit fda703a into master Mar 11, 2021
@sgugger sgugger deleted the fix_integration_slow_tests branch March 11, 2021 18:43
Iwontbecreative pushed a commit to Iwontbecreative/transformers that referenced this pull request Jul 15, 2021
* PoC

* Fix slow tests for the PT1.8 Embedding problem