Fix TF Roberta for mixed precision training #11675

Merged
LysandreJik merged 1 commit into huggingface:master from jplu:fix-tf-roberta-mixed-precision on May 11, 2021

Conversation

@jplu (Contributor) commented on May 11, 2021

What does this PR do?

This PR fixes the TF Roberta model for mixed precision training, bringing it in line with the other models.

Fixes #11282
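
For context, the usual mixed-precision problem in the TF models is a tensor (for example the extended attention mask) being cast to a hardcoded `tf.float32`, which clashes with `float16` activations once the `mixed_float16` Keras policy is active. The sketch below illustrates that pattern and the typical remedy of casting to the activations' dtype; it is not the exact diff from this PR, and the function and variable names are hypothetical.

```python
# Illustrative sketch only, not the exact change merged here.
import tensorflow as tf

# Enable mixed precision: activations become float16, variables stay float32.
tf.keras.mixed_precision.set_global_policy("mixed_float16")

def extend_attention_mask(attention_mask, embeddings):
    # Hypothetical helper. Before a fix of this kind, code along these lines would do:
    #   mask = tf.cast(attention_mask, tf.float32)
    # which produces a dtype mismatch when added to float16 attention scores.
    # Casting to the same dtype as the activations keeps everything consistent:
    mask = tf.cast(attention_mask, dtype=embeddings.dtype)
    # Large negative bias for masked positions, kept in the matching dtype.
    return (1.0 - mask) * tf.constant(-10000.0, dtype=embeddings.dtype)

# Usage: float16 embeddings (as produced under mixed_float16) and an integer mask.
emb = tf.random.normal((1, 4, 8), dtype=tf.float16)
attn = tf.constant([[1, 1, 1, 0]])
print(extend_attention_mask(attn, emb).dtype)  # float16
```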

@LysandreJik (Member) left a comment

Looks good to me, thanks a lot @jplu! @Rocketknight1 do you mind taking a look?

Do other models need to be updated as well?

@Rocketknight1 (Member) commented

It looks good to me too. Thanks for the PR!

LysandreJik merged commit d9b2862 into huggingface:master on May 11, 2021
Iwontbecreative pushed a commit to Iwontbecreative/transformers that referenced this pull request on Jul 15, 2021
jplu deleted the fix-tf-roberta-mixed-precision branch on June 13, 2023