
GPT Neo configuration needs to be set to use GPT2 tokenizer#10992

Merged
LysandreJik merged 1 commit into master from gpt-neo-auto-tokenizer on Mar 31, 2021

Conversation

@LysandreJik
Member

The tokenizer wasn't correctly set and ended up making ~200 slow tests fail. The run in question is here: https://github.com/huggingface/transformers/runs/2232656252?check_suite_focus=true

This PR fixes that!
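For illustration, the kind of mapping being fixed can be sketched in plain Python (the names `TOKENIZER_MAPPING` and `resolve_tokenizer` here are hypothetical, not the actual transformers internals): GPT Neo reuses the GPT-2 BPE vocabulary, so its configuration should resolve to `GPT2Tokenizer` rather than a nonexistent GPT Neo tokenizer class.

```python
# Minimal sketch of a model-type -> tokenizer-class lookup, assuming a
# dict-based registry like the Auto* factories use. Not the real
# transformers code; names here are illustrative only.
TOKENIZER_MAPPING = {
    "gpt2": "GPT2Tokenizer",
    "gpt_neo": "GPT2Tokenizer",  # GPT Neo shares the GPT-2 BPE tokenizer
}

def resolve_tokenizer(model_type: str) -> str:
    """Return the tokenizer class name registered for a model type."""
    try:
        return TOKENIZER_MAPPING[model_type]
    except KeyError:
        raise ValueError(f"No tokenizer registered for {model_type!r}")

print(resolve_tokenizer("gpt_neo"))  # GPT2Tokenizer
```

Without an entry like the `"gpt_neo"` one, a tokenizer lookup for the model fails, which is consistent with the cascade of slow-test failures described above.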

@LysandreJik LysandreJik requested a review from patil-suraj March 31, 2021 11:55
Contributor

@patil-suraj patil-suraj left a comment


Thanks a lot for fixing this!

@LysandreJik LysandreJik merged commit a96edb8 into master Mar 31, 2021
@LysandreJik LysandreJik deleted the gpt-neo-auto-tokenizer branch March 31, 2021 12:03
Iwontbecreative pushed a commit to Iwontbecreative/transformers that referenced this pull request Jul 15, 2021
