Fix and re-enable ConversationalPipeline tests #26907

Merged
Rocketknight1 merged 2 commits into main from fix_conversation_pipeline_tests on Oct 19, 2023

Conversation

@Rocketknight1 (Member)

The bug didn't turn out to be too bad - some models just had very short max_position_embeddings in their test configs, which meant the conversation tests generated outputs that were too long. Limiting max_new_tokens seems to have fixed it, but I'm running other tests to be sure!

@Rocketknight1 (Member, Author)

Looks good - ready to merge after review!

@ArthurZucker (Collaborator) left a comment


Maybe limiting the max length to the position embeddings would be more foolproof!
LGTM

@Rocketknight1 (Member, Author)

@ArthurZucker I tried that, but sometimes the tests get very slow when max_position_embeddings is large! Using a small number like 10 or 20 keeps the test quick.
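The trade-off discussed above can be sketched as a small helper (hypothetical, not the code actually changed in this PR): keep a small `max_new_tokens` ceiling so tests stay fast, but also clamp it so the prompt plus generated tokens never exceeds the model's `max_position_embeddings`.

```python
# Hypothetical sketch of the fix discussed above, not the PR's actual diff.
def capped_max_new_tokens(prompt_len: int, max_position_embeddings: int, ceiling: int = 20) -> int:
    """Return a max_new_tokens value that fits within the position embeddings.

    ceiling keeps tests fast even when max_position_embeddings is large;
    the remaining room ensures generation never runs past the embedding table.
    """
    room = max_position_embeddings - prompt_len  # positions the model can still fill
    return max(0, min(ceiling, room))
```

With a tiny test config (say `max_position_embeddings=128`), a 5-token prompt would get the full ceiling of 20 new tokens, while a 120-token prompt would be clamped to 8.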

@Rocketknight1 Rocketknight1 merged commit bdbcd5d into main Oct 19, 2023
@Rocketknight1 Rocketknight1 deleted the fix_conversation_pipeline_tests branch October 19, 2023 11:04
EduardoPach pushed a commit to EduardoPach/transformers that referenced this pull request Nov 19, 2023
* Fix and re-enable conversationalpipeline tests

* Fix the batch test so the change only applies to conversational pipeline
