Conversation
src/transformers/generation/utils.py
Outdated
If `generation_config.max_length` is not the same as the default value (20), let's not change it!
Should we not change `has_default_max_length` directly? 🤗
I would rather avoid changing its original definition in this PR as it is also used in several places.
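The compromise discussed above can be sketched as follows. This is a minimal illustration, not the actual `transformers` source: `DEFAULT_MAX_LENGTH`, `resolve_max_length`, and the stripped-down `GenerationConfig` class are hypothetical stand-ins for the real definitions in `generation/utils.py`. The point is that a user-set `max_length` is left untouched, and only the untouched library default (20) gets replaced, without redefining `has_default_max_length` itself.

```python
DEFAULT_MAX_LENGTH = 20  # hypothetical stand-in for the library default

class GenerationConfig:
    """Stripped-down stand-in for transformers' GenerationConfig."""

    def __init__(self, max_length=DEFAULT_MAX_LENGTH):
        self.max_length = max_length

def resolve_max_length(generation_config, model_default):
    """Return the max_length to use, preferring an explicit user value."""
    # Treat max_length as "default" only if it still holds the library default.
    has_default_max_length = generation_config.max_length == DEFAULT_MAX_LENGTH
    if has_default_max_length:
        # Safe to replace the untouched default with the model's own value.
        return model_default
    # The user explicitly set max_length: do not change it.
    return generation_config.max_length
```

With this guard, a Whisper-style model default (e.g. 448) wins only when the user never set `max_length` themselves.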
The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.
ArthurZucker left a comment
Thanks, this fixes quite some tests, no? Can we make sure with a tiny pipeline that by default we generate more than 20 tokens?
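A hedged sketch of the kind of regression check asked for above: with a model whose config supplies its own `max_length`, default generation should no longer be capped at the old library default of 20 tokens. `fake_generate`, `LIBRARY_DEFAULT`, and the model value of 448 are illustrative stand-ins, not the `transformers` pipeline API.

```python
LIBRARY_DEFAULT = 20  # hypothetical stand-in for generation's default max_length

def fake_generate(model_max_length, user_max_length=None):
    """Toy generate(): the model's value wins unless the user overrode it."""
    effective = user_max_length if user_max_length is not None else model_max_length
    return ["tok"] * effective

# By default (no user override), we should get more than 20 tokens.
tokens = fake_generate(model_max_length=448)
assert len(tokens) > LIBRARY_DEFAULT
```

In a real test this would run a tiny checkpoint through the `text-generation` pipeline and assert on the decoded output length instead of a stub.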
src/transformers/generation/utils.py
Outdated
> Should we not change `has_default_max_length` directly? 🤗

Do you mean a new test case that will
Merging it as it will unblock the Whisper CI (170 -> 20 failures) + (quite) some pipeline tests.
update Co-authored-by: ydshieh <ydshieh@users.noreply.github.com>
Revert "Fix Whisper CI (huggingface#34541)" This reverts commit eb81144.
What does this PR do?