Conversation

@jeffra jeffra commented Jul 29, 2022

When enabled, auto-cast all inputs to fp16. This assumes the entire model is in fp16 as well; it is not to be confused with setups where the model is composed of mixed dtypes, e.g. certain layers in fp32 and others in fp16.
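For context, the linked bug arises when an fp16 engine receives fp32 FloatTensor inputs (e.g. image pixels or audio features), producing a dtype mismatch at the first layer. Below is a minimal sketch of what such an input auto-cast amounts to; it is not the actual DeepSpeed implementation, and the helper name and container handling are assumptions for illustration:

```python
import torch

def cast_float_inputs_to_half(value):
    # Recursively cast floating-point tensors to fp16; leave integer/bool
    # tensors and non-tensor values untouched (hypothetical helper name).
    if torch.is_tensor(value) and value.is_floating_point():
        return value.half()
    if isinstance(value, (list, tuple)):
        return type(value)(cast_float_inputs_to_half(v) for v in value)
    if isinstance(value, dict):
        return {k: cast_float_inputs_to_half(v) for k, v in value.items()}
    return value

# Image-classification style batch, as in the linked issue: fp32 pixel
# values would otherwise hit a model converted with model.half() and
# raise a dtype mismatch.
batch = {
    "pixel_values": torch.randn(2, 3, 224, 224),  # torch.float32
    "labels": torch.tensor([0, 1]),               # torch.int64, left as-is
}
casted = cast_float_inputs_to_half(batch)
print(casted["pixel_values"].dtype, casted["labels"].dtype)
# -> torch.float16 torch.int64
```

In DeepSpeed itself the cast is applied to the engine's forward inputs only when the option is enabled (a flag under the fp16 section of the config, judging by the auto-cast branch name), so setups with mixed-dtype models can leave it off.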

Fixes #2019

@jeffra jeffra merged commit a039e22 into master Jul 30, 2022
@jeffra jeffra deleted the auto-cast branch July 30, 2022 17:33

Development

Successfully merging this pull request may close these issues.

[BUG] Error when enabling fp16 config with tasks involving FloatTensor Inputs such as Image Classification, Speech Recognition ...
