[BUG] Fix bug in cast in quantization#481

Merged
vadiklyutiy merged 3 commits into main from vadim/quant-cast on Dec 21, 2024

Conversation

@vadiklyutiy (Collaborator)

No description provided.

@vadiklyutiy vadiklyutiy self-assigned this Dec 21, 2024
@vadiklyutiy vadiklyutiy merged commit c0525b7 into main Dec 21, 2024
@vadiklyutiy vadiklyutiy deleted the vadim/quant-cast branch December 21, 2024 02:02
vadiklyutiy added a commit that referenced this pull request Dec 21, 2024
vadiklyutiy added a commit that referenced this pull request Dec 21, 2024
vadiklyutiy pushed a commit that referenced this pull request Dec 26, 2024
Changing the QxK^T accumulator from fp16 to fp32 in causal attention.
The same change previously solved the accuracy issue in masked attention:
https://github.com/CentML/hidet/issues/465

---------

Co-authored-by: Zhumakhan <nazirzhumakhan@gmail.com>
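
For context, the referenced commit boils down to accumulating the QxK^T attention scores in fp32 rather than fp16. Below is a minimal NumPy sketch of why the accumulator precision matters; it is illustrative only (the shapes, names, and loop structure are made up for the demo, not taken from hidet's kernel code):

```python
# Minimal NumPy sketch (illustrative, not hidet's kernel code) of why the
# QxK^T accumulator precision matters. The inputs are fp16 either way; only
# the dtype used to accumulate the dot products differs.
import numpy as np

rng = np.random.default_rng(0)
seq_len, head_dim = 512, 64
q = rng.standard_normal((seq_len, head_dim)).astype(np.float16)
k = rng.standard_normal((seq_len, head_dim)).astype(np.float16)

# fp16 accumulator: every partial sum is rounded back to fp16.
scores_fp16 = np.zeros((seq_len, seq_len), dtype=np.float16)
for d in range(head_dim):
    scores_fp16 += np.outer(q[:, d], k[:, d])  # in-place add stays fp16

# fp32 accumulator, as in this fix: products are summed in fp32.
scores_fp32 = q.astype(np.float32) @ k.astype(np.float32).T

# float64 reference isolates the accumulation error.
ref = q.astype(np.float64) @ k.astype(np.float64).T
print("max abs error, fp16 accumulator:", np.abs(scores_fp16 - ref).max())
print("max abs error, fp32 accumulator:", np.abs(scores_fp32 - ref).max())
```

The fp16-accumulated scores typically drift from the reference by orders of magnitude more than the fp32-accumulated ones, which is consistent with the accuracy issue linked above.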
AndreSlavescu pushed a commit to AndreSlavescu/hidet that referenced this pull request May 31, 2025