Shorten the conversation tests for speed + fixing position overflows #26960
Conversation
The documentation is not available anymore as the PR was closed or merged.
Hi @Rocketknight1. Thanks a lot. I think it's probably better to simply skip this test for blenderbot small. Could you remind me why the test was never failing before?
@ydshieh the test never failed before because the …
Thanks. Then let's just skip it for this model. You can search …
@ydshieh are you sure? I think the test makes more sense with very short outputs anyway - we don't need 20 tokens to confirm the pipeline works!
That's true, but making it this way also hides the fact that some models fail this test. Having a skip makes it explicit. We can definitely use …
BlenderBotSmall can pass the test! The problem only arises because the test …
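For context, a minimal numeric sketch of the overflow being discussed (the values are assumptions based on this thread's tiny test config, not numbers copied from the repo):

```python
# Illustrative only: approximate token counts for the old test settings.
prompt_len = 6                # e.g. "Hi there!" plus special tokens, roughly
max_new_tokens = 20           # old value used by the conversational pipeline test
max_position_embeddings = 20  # assumed value in the tiny random test config

# Generation appends new tokens after the prompt, so the final sequence needs
# prompt_len + max_new_tokens positions, which can exceed the embedding table.
total_positions = prompt_len + max_new_tokens
print(total_positions > max_position_embeddings)  # True -> position-id overflow
```

With max_new_tokens=5 the total stays within the tiny model's position table, which is why the shorter outputs avoid the failure for most models.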
Hey! Let's just skip it for now, it's blocking a few PRs!
I'll merge #27013 so you have time to work on the fix
e26da79 to f44f2af (compare)
cc @ydshieh @ArthurZucker this should be ready to go now! I've skipped the test for blenderbot_small, and I also increased max_position_embeddings in the blenderbot tests.
Quick ping again @ydshieh @ArthurZucker! This PR should be ready to go
@@ -85,7 +85,7 @@ def __init__(
         hidden_act="gelu",
         hidden_dropout_prob=0.1,
         attention_probs_dropout_prob=0.1,
-        max_position_embeddings=20,
+        max_position_embeddings=50,
This change won't be reflected in the tiny models uploaded to the Hub (hf-internal-testing), so the pipeline tests will still use the previous value. We still skip the test, so it is fine, but the change here has no effect.
(the above is just a comment, not a request to change)
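As a rough sketch of why the local change has no effect: the pipeline tests load pre-built tiny checkpoints from the Hub rather than the config constructed by the ModelTester. The repo id below is illustrative and may not match the exact checkpoint used.

```python
# Hedged sketch: the pipeline tests pull tiny checkpoints from the Hub, whose
# configs were generated before this PR, so they keep the old value.
from transformers import AutoConfig

# Hypothetical tiny-model id under hf-internal-testing; the real id may differ.
config = AutoConfig.from_pretrained(
    "hf-internal-testing/tiny-random-BlenderbotSmallForConditionalGeneration"
)
print(config.max_position_embeddings)  # still the value baked into the uploaded tiny model
```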
@unittest.skip("Tiny random model has too few position embeddings for this.")
def test_pipeline_conversational(self):
    pass
Let's put this back into is_pipeline_test_to_skip (above), and just remove # TODO @Rocketnight1 to fix, using the comment "Tiny random model has too few position embeddings for this." as the reason instead.
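A minimal sketch of that suggestion, assuming the hook signature from the transformers test mixin (the condition and argument names are approximations, not the merged code):

```python
# Sketch only: skipping via the test-mixin hook instead of @unittest.skip.
# This method would live on the BlenderbotSmall model test class.
def is_pipeline_test_to_skip(
    self, pipeline_test_casse_name, config_class, model_architecture, tokenizer_name, processor_name
):
    if pipeline_test_casse_name == "ConversationalPipelineTests":
        # Tiny random model has too few position embeddings for this.
        return True
    return False
```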
Will do!
@unittest.skip("Tiny random model has too few position embeddings for this.")
def test_pipeline_conversational(self):
    pass
same
@unittest.skip("Tiny random model has too few position embeddings for this.")
def test_pipeline_conversational(self):
    pass
same
@@ -77,14 +77,14 @@ def get_test_pipeline(self, model, tokenizer, processor):

     def run_pipeline_test(self, conversation_agent, _):
         # Simple
-        outputs = conversation_agent(Conversation("Hi there!"), max_new_tokens=20)
+        outputs = conversation_agent(Conversation("Hi there!"), max_new_tokens=5)
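For reference, a hedged usage sketch of what the shortened test exercises; the model id here is chosen for illustration, while the test itself runs against tiny random models:

```python
# Illustrative only: the conversational pipeline call the test shortens to 5 new tokens.
from transformers import Conversation, pipeline

conversation_agent = pipeline("conversational", model="facebook/blenderbot_small-90M")
conversation = Conversation("Hi there!")
conversation = conversation_agent(conversation, max_new_tokens=5)  # 5 tokens suffice to exercise the pipeline
print(conversation.generated_responses[-1])
```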
ok
Damn, I made a review but forgot to submit the comments. Sorry
@ydshieh Changes made as you suggested - I used is_pipeline_test_to_skip for the skips.
Thanks for fixing!
Resolved (outdated) review threads:
- tests/models/blenderbot_small/test_modeling_blenderbot_small.py
- tests/models/blenderbot_small/test_modeling_flax_blenderbot_small.py
- tests/models/blenderbot_small/test_modeling_tf_blenderbot_small.py
Co-authored-by: amyeroberts <22614925+amyeroberts@users.noreply.github.com>
…all.py Co-authored-by: amyeroberts <22614925+amyeroberts@users.noreply.github.com>
…l.py Co-authored-by: amyeroberts <22614925+amyeroberts@users.noreply.github.com>
Shorten the conversation tests for speed + fixing position overflows (huggingface#26960)
* Shorten the conversation tests for speed + fixing position overflows
* Put max_new_tokens back to 5
* Remove test skips
* Increase max_position_embeddings in blenderbot tests
* Add skips for blenderbot_small
* Correct TF test skip
* make fixup
* Reformat skips to use is_pipeline_test_to_skip
* Update tests/models/blenderbot_small/test_modeling_blenderbot_small.py (Co-authored-by: amyeroberts <22614925+amyeroberts@users.noreply.github.com>)
* Update tests/models/blenderbot_small/test_modeling_flax_blenderbot_small.py (Co-authored-by: amyeroberts <22614925+amyeroberts@users.noreply.github.com>)
* Update tests/models/blenderbot_small/test_modeling_tf_blenderbot_small.py (Co-authored-by: amyeroberts <22614925+amyeroberts@users.noreply.github.com>)

Co-authored-by: amyeroberts <22614925+amyeroberts@users.noreply.github.com>
Some of the conversation tests were still overflowing the maximum position embeddings for very short models like BlenderBotSmall. I reduced the limits a lot, which also speeds up those tests! cc @ydshieh