
[core] Fix keep in fp32 silent bug#26484

Closed
younesbelkada wants to merge 7 commits into huggingface:main from younesbelkada:fix-module-fp32

Conversation


@younesbelkada younesbelkada commented Sep 29, 2023

What does this PR do?

Before this PR we were performing a simple substring check (`if module_name in key`), which led to some modules being silently converted to fp32.

For example, InstructBLIP models got their word-embedding layers converted to fp32 because `_keep_in_fp32_modules` includes `"wo"`, which is a substring of `"word_embedding"`. The fix is to check `if module_name in key.split(".")` instead, so that only exact module names match.
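A minimal sketch of the matching logic before and after the fix (the function names and example state-dict keys are illustrative, not the actual transformers internals; only the `"wo"` entry in `_keep_in_fp32_modules` comes from T5):

```python
# "wo" is listed in T5's _keep_in_fp32_modules so its output projection
# stays in fp32; the key strings below are hypothetical examples.
_keep_in_fp32_modules = ["wo"]

def should_keep_in_fp32_buggy(key: str) -> bool:
    # Before this PR: plain substring check. "wo" matches inside
    # "word_embeddings", silently upcasting unrelated modules.
    return any(module_name in key for module_name in _keep_in_fp32_modules)

def should_keep_in_fp32_fixed(key: str) -> bool:
    # After this PR: compare against the dot-separated components,
    # so only an exact module name matches.
    return any(module_name in key.split(".") for module_name in _keep_in_fp32_modules)

print(should_keep_in_fp32_buggy("shared.word_embeddings.weight"))                    # True (wrong)
print(should_keep_in_fp32_fixed("shared.word_embeddings.weight"))                    # False (correct)
print(should_keep_in_fp32_fixed("decoder.block.0.layer.2.DenseReluDense.wo.weight"))  # True (intended)
```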

cc @ydshieh

Related bnb and T5 tests all pass.

Need to investigate whether the InstructBLIP tests pass.

@younesbelkada younesbelkada marked this pull request as draft September 29, 2023 07:32

HuggingFaceDocBuilderDev commented Sep 29, 2023

The documentation is not available anymore as the PR was closed or merged.
