Idefics: enable generation tests#34062
Conversation
ArthurZucker
left a comment
LGTM, thanks for debugging these tests!
```diff
         )
-        else:
+        # some (V)LLMs have hard requirement on SDPA and thus never return attn
+        elif outputs.attentions[0] is not None:
```
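A minimal sketch of the guard pattern the diff above suggests (the `FakeOutputs` class and `check_attentions` helper are hypothetical, for illustration only): when a model is forced onto SDPA it never returns attention weights, so the test should only assert on attention shapes when `outputs.attentions[0]` is actually populated.

```python
class FakeOutputs:
    """Stand-in for a model output object; SDPA-only models return None attentions."""
    def __init__(self, attentions):
        self.attentions = attentions

def check_attentions(outputs, expected_layers):
    # Some (V)LLMs have a hard requirement on SDPA and thus never return
    # attention weights; skip the shape assertion in that case.
    if outputs.attentions and outputs.attentions[0] is not None:
        assert len(outputs.attentions) == expected_layers
        return True   # assertions ran
    return False      # check skipped: no attentions available

ran = check_attentions(FakeOutputs(attentions=(None,)), expected_layers=1)
print(ran)  # → False: SDPA-only model, shape check skipped
```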
gante
left a comment
LGTM, except for those two continue replacements.
(tests 💛 )
tests/generation/test_utils.py
Outdated
```diff
         # decoder)
         if config.is_encoder_decoder:
-            continue
+            self.skipTest("This test is for decoder-only models")
```
We need a continue here :P
With a skip, we skip for all other models in self.all_generative_model_classes, and we might have encoder-decoder and decoder-only models mixed up together in that iterator
hmm ok, I didn't think we could have cases with mixed model types. I was just confused that the test was passing for one model but not the other, and it took me a while to figure out hehe
tests/generation/test_utils.py
Outdated
```diff
         model = model_class(config).to(torch_device).eval()
         if "inputs_embeds" not in inspect.signature(model.prepare_inputs_for_generation).parameters.keys():
-            continue
+            self.skipTest("This model doesn't accept `inputs_embeds` in generate")
```
Likewise, this also needs to be a continue
The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.
* add idefics
* conflicts after merging main
* enable tests but need to fix some
* fix tests
* no print
* fix/skip some slow tests
* continue not skip
* rebasing broken smth, this is the fix
What does this PR do?
Enables GenerationTesterMixin for Idefics models. Tests are passing for me locally.
There are a few open Idefics PRs already, and they need to be reviewed/merged in the following order: #33907 -> #34043 -> current PR