fix test_generated_length_assisted_generation #34935
Conversation
@zucchini-nlp can you take a look at this?
zucchini-nlp left a comment
Hey! Can you explain why we need a test for `max_new_tokens=7`, and why the check on `length <= 20` was removed?
Is the test failing for you on any of the PRs?
@zucchini-nlp I noticed that this test fails locally for me. Are you able to run it successfully on your end? While reviewing the test, I saw that it calls Additionally, I added a third call to Let me know if this makes sense or if you have any concerns. Would appreciate your approval if it looks alright to you.
zucchini-nlp left a comment
Ah, makes sense now. Indeed the test is failing, and it is weird it has not been causing us trouble before (cc @ydshieh maybe). The issue stems from #34377 and a preceding PR, where the default was changed from `max_length=20` to `max_new_tokens=20`.
So I think it is better if you modify the code to `out.shape[-1] <= 20 + input_ids.shape[1]` and leave a small comment explaining where the 20 comes from :)
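A minimal sketch of the suggested bound, assuming the names from this thread (the helper itself is hypothetical; `out_len` stands for `out.shape[-1]`, `prompt_len` for `input_ids.shape[1]`, and 20 is the default `max_new_tokens` being discussed):

```python
def generated_length_ok(out_len: int, prompt_len: int, max_new_tokens: int = 20) -> bool:
    # With a max_new_tokens-style default, up to 20 *new* tokens may be
    # appended on top of the prompt, so the total length is bounded by
    # prompt_len + 20 rather than a flat 20.
    return out_len <= max_new_tokens + prompt_len

print(generated_length_ok(12, 5))   # prompt of 5 plus 7 new tokens: within the cap
print(generated_length_ok(26, 5))   # 21 new tokens: exceeds the cap
```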
See #34807. TL;DR: a test fetcher issue means some tests are not fetched (including this one). Thank you @keyboardAnt for helping and for this PR. For this length issue, @gante on our team has opened #34814, but we are still discussing what the final call will be. Therefore I think it would be good not to merge this PR and to wait for #34814 before making a decision.
@ydshieh, thank you for responding! I'm a bit puzzled though, since #34814 doesn't fix
@keyboardAnt it should fix it by removing extra length added in |
Hi @keyboardAnt. IIRC, the issue comes from #34377, which is from #34026 (and later #34617 is involved). This is more of a design decision we have to make, which is still under discussion. Fixing a test while we still need to agree on a choice isn't a super great idea :-). (If it's about fixing something that affects many users' usage, that's another story.)
But if our assumption about the cause of the failure is wrong, please correct us 🙏
Agreed with the fix: with the newer defaults, we will return `input_length + 20` tokens (as opposed to 20 tokens)
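The point above can be illustrated with a tiny sketch. The parameter names `max_length` / `max_new_tokens` follow the thread; the helper itself is hypothetical:

```python
def total_length_cap(input_length: int, *, new_defaults: bool = True) -> int:
    # Old default: max_length=20 caps the *total* sequence at 20 tokens.
    # New default: max_new_tokens=20 caps only the newly generated tokens,
    # so the total can reach input_length + 20.
    return input_length + 20 if new_defaults else 20

print(total_length_cap(5))                      # cap under the new defaults
print(total_length_cap(5, new_defaults=False))  # cap under the old default
```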
Thank you for fixing it @keyboardAnt 🤗
fix test_generated_length_assisted_generation
What does this PR do?
This PR fixes and expands the `test_generated_length_assisted_generation` test in `test_utils.py`.
Before submitting
Who can review?