Shorten the conversation tests for speed + fixing position overflows #26960
Rocketknight1 merged 11 commits into main
Conversation
|
The documentation is not available anymore as the PR was closed or merged. |
|
Hi @Rocketknight1. Thanks a lot. I think it's probably better to simply skip this test for BlenderbotSmall. Could you remind me why the test never failed before? |
|
@ydshieh the test never failed before because the |
|
Thanks. Then let's just skip it for this model. You can search |
|
@ydshieh are you sure? I think the test makes more sense with very short outputs anyway - we don't need 20 tokens to confirm the pipeline works! |
|
That's true, but doing it this way also hides the fact that some models fail this test. Having a skip makes it explicit. We can definitely use |
|
BlenderBotSmall can pass the test! The problem only arises because the test |
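The overflow being discussed comes down to simple arithmetic: a decoder assigns each generated token the next position index, so generation fails once the prompt length plus `max_new_tokens` exceeds `max_position_embeddings`. A hedged standalone sketch of that check (not the actual transformers test code; the function and variable names are illustrative assumptions):

```python
# Illustrative sketch: why a tiny test model with max_position_embeddings=20
# can overflow during a conversation pipeline test.

def overflows(prompt_len: int, max_new_tokens: int, max_position_embeddings: int) -> bool:
    """Return True if generation would need position indices beyond the model's limit.

    For decoder models, each generated token occupies the next position index,
    so the final sequence length is prompt_len + max_new_tokens.
    """
    return prompt_len + max_new_tokens > max_position_embeddings

# A 5-token prompt plus 20 new tokens needs 25 positions, exceeding a 20-position limit:
print(overflows(prompt_len=5, max_new_tokens=20, max_position_embeddings=20))  # True
# Shortening generation to 5 new tokens fits comfortably:
print(overflows(prompt_len=5, max_new_tokens=5, max_position_embeddings=20))   # False
```

This is why both fixes in the thread work: raising `max_position_embeddings` grows the budget, while lowering `max_new_tokens` shrinks the demand.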
|
Hey! Let's just skip it for now, it's blocking a few PRs! |
|
I'll merge #27013 so you have time to work on the fix |
Force-pushed from e26da79 to f44f2af
|
cc @ydshieh @ArthurZucker this should be ready to go now! I've skipped the test for BlenderBotSmall. I also increased max_position_embeddings in the blenderbot tests. |
|
Quick ping again @ydshieh @ArthurZucker! This PR should be ready to go |
        hidden_dropout_prob=0.1,
        attention_probs_dropout_prob=0.1,
-       max_position_embeddings=20,
+       max_position_embeddings=50,
This change won't be reflected in the tiny models uploaded to the Hub (hf-internal-testing), so the pipeline testing will still use the previous value. We still skip the test, so it is fine, but the change here has no effect.
(the above is just a comment, not a request to change)
+   @unittest.skip("Tiny random model has too few position embeddings for this.")
+   def test_pipeline_conversational(self):
+       pass
Let's put this back in `is_pipeline_test_to_skip` (above) and just remove `# TODO @Rocketnight1 to fix`, with the comment "Tiny random model has too few position embeddings for this."
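For readers unfamiliar with the hook being suggested: model test classes in transformers can override `is_pipeline_test_to_skip` to declare which pipeline tests should be skipped for that model. A minimal standalone sketch of the pattern (the class name and argument names here are illustrative assumptions; the real mixin's signature varies across transformers versions):

```python
# Standalone sketch of the is_pipeline_test_to_skip pattern discussed above.
# This is NOT the real transformers test mixin; names are assumptions.

class BlenderbotSmallModelTestSketch:
    def is_pipeline_test_to_skip(
        self, pipeline_test_name, config_class, model_architecture, tokenizer_name, processor_name
    ):
        # Tiny random model has too few position embeddings for this.
        if pipeline_test_name == "ConversationalPipelineTests":
            return True
        return False
```

Compared with a bare `@unittest.skip` on the test method, this keeps all pipeline-test skips in one discoverable place, which is the explicitness @ydshieh is asking for.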
+   @unittest.skip("Tiny random model has too few position embeddings for this.")
+   def test_pipeline_conversational(self):
+       pass

+   @unittest.skip("Tiny random model has too few position embeddings for this.")
+   def test_pipeline_conversational(self):
+       pass
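The diffs above use the standard `unittest.skip` decorator, which marks a test as skipped without running its body. A self-contained sketch of the same pattern (the test-class name is illustrative, not the actual transformers class):

```python
import unittest

class ConversationalTestSketch(unittest.TestCase):
    # Same skip pattern as in the diffs above; the class name is illustrative.
    @unittest.skip("Tiny random model has too few position embeddings for this.")
    def test_pipeline_conversational(self):
        pass
```

The skip reason string is surfaced in the test runner's output, so a reader of the CI logs sees why the test was excluded.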
    def run_pipeline_test(self, conversation_agent, _):
        # Simple
-       outputs = conversation_agent(Conversation("Hi there!"), max_new_tokens=20)
+       outputs = conversation_agent(Conversation("Hi there!"), max_new_tokens=5)
|
Damn, I made a review but forgot to submit the comments. Sorry |
|
@ydshieh Changes made as you suggested - I used `is_pipeline_test_to_skip` |
tests/models/blenderbot_small/test_modeling_blenderbot_small.py
tests/models/blenderbot_small/test_modeling_flax_blenderbot_small.py
tests/models/blenderbot_small/test_modeling_tf_blenderbot_small.py
Co-authored-by: amyeroberts <22614925+amyeroberts@users.noreply.github.com>
…all.py Co-authored-by: amyeroberts <22614925+amyeroberts@users.noreply.github.com>
…l.py Co-authored-by: amyeroberts <22614925+amyeroberts@users.noreply.github.com>
…uggingface#26960)
* Shorten the conversation tests for speed + fixing position overflows
* Put max_new_tokens back to 5
* Remove test skips
* Increase max_position_embeddings in blenderbot tests
* Add skips for blenderbot_small
* Correct TF test skip
* make fixup
* Reformat skips to use is_pipeline_test_to_skip
* Update tests/models/blenderbot_small/test_modeling_blenderbot_small.py
* Update tests/models/blenderbot_small/test_modeling_flax_blenderbot_small.py
* Update tests/models/blenderbot_small/test_modeling_tf_blenderbot_small.py
Co-authored-by: amyeroberts <22614925+amyeroberts@users.noreply.github.com>
Some of the conversation tests were still overflowing the maximum position embeddings for very short models like
BlenderBotSmall. I reduced the limits a lot, which also speeds up those tests! cc @ydshieh