
Fix PerceiverModelIntegrationTest::test_inference_masked_lm#26760

Merged
LysandreJik merged 1 commit into main from fix_perceiver on Oct 12, 2023
Conversation

@ydshieh
Collaborator

@ydshieh ydshieh commented Oct 12, 2023

What does this PR do?

PR #23909 changed the value of `vocab_size` for

tokenizer = PerceiverTokenizer.from_pretrained("deepmind/language-perceiver")

from 262 to 256, but the logits have shape [1, 2048, 262].

Let's use len(tokenizer) here instead.

To reproduce:

from transformers import PerceiverTokenizer

tokenizer = PerceiverTokenizer.from_pretrained("deepmind/language-perceiver")
# 256 on `2da88537` but `262` on one commit before (`835b0a05`)
print(tokenizer.vocab_size)
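
The distinction behind the fix can be sketched without downloading the model: for `transformers` tokenizers, `vocab_size` reports only the base vocabulary, while `len(tokenizer)` also counts the added special tokens, which is what the model's output dimension (262) reflects. Below is a minimal mock illustrating the arithmetic; `MockPerceiverTokenizer` and the exact list of special tokens are illustrative assumptions, not the real `PerceiverTokenizer`.

```python
# Minimal mock showing why len(tokenizer) can differ from tokenizer.vocab_size.
# (Hypothetical stand-in for illustration, not the real transformers class.)
class MockPerceiverTokenizer:
    def __init__(self, vocab_size, added_tokens):
        self.vocab_size = vocab_size      # base vocabulary only
        self.added_tokens = added_tokens  # special tokens appended on top

    def __len__(self):
        # Full vocabulary = base vocab + added special tokens; this is
        # the size that matches the model's logit dimension.
        return self.vocab_size + len(self.added_tokens)


tokenizer = MockPerceiverTokenizer(
    vocab_size=256,
    added_tokens=["[PAD]", "[BOS]", "[EOS]", "[MASK]", "[CLS]", "[SEP]"],
)
print(tokenizer.vocab_size)  # 256 -- no longer matches the logits' last dim
print(len(tokenizer))        # 262 -- matches the logit shape [1, 2048, 262]
```

A test comparing logits against `len(tokenizer)` rather than `tokenizer.vocab_size` therefore stays correct even when the base vocabulary accounting changes.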

@ydshieh ydshieh requested a review from LysandreJik October 12, 2023 13:26
@HuggingFaceDocBuilderDev

HuggingFaceDocBuilderDev commented Oct 12, 2023

The documentation is not available anymore as the PR was closed or merged.

Member

@LysandreJik LysandreJik left a comment


Correct change!

@LysandreJik LysandreJik merged commit a243cdc into main Oct 12, 2023
@LysandreJik LysandreJik deleted the fix_perceiver branch October 12, 2023 15:43