Fix splash attention test#8978

Merged
zpcore merged 9 commits into master from piz/skip_test on Apr 16, 2025

Conversation

@zpcore
Member

@zpcore zpcore commented Apr 15, 2025

Fix splash attention test and add causal option.

@zpcore zpcore marked this pull request as ready for review April 15, 2025 22:34
@zpcore
Member Author

zpcore commented Apr 15, 2025

I completely got rid of flash attention and used vanilla attention in this case. To test against vanilla attention with segment ids, we need to support causal = True for the splash attention kernel.
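As an aside, the vanilla-attention reference that a kernel like this is typically checked against can be sketched roughly as below. This is a hypothetical NumPy sketch, not the code in this PR; the function name, signature, and mask conventions are assumptions:

```python
import numpy as np

def vanilla_attention(q, k, v, segment_ids=None, causal=False):
    """Reference attention with optional causal mask and segment ids.

    q, k, v: [seq_len, head_dim] arrays; segment_ids: [seq_len] int array.
    Hypothetical illustration only, not the kernel under test.
    """
    seq_len, head_dim = q.shape
    logits = q @ k.T / np.sqrt(head_dim)                      # [seq, seq]
    mask = np.ones((seq_len, seq_len), dtype=bool)
    if causal:
        # query i may only attend to keys j <= i
        mask &= np.tril(np.ones((seq_len, seq_len), dtype=bool))
    if segment_ids is not None:
        # tokens attend only within their own segment
        mask &= segment_ids[:, None] == segment_ids[None, :]
    logits = np.where(mask, logits, -1e30)                    # mask out disallowed pairs
    weights = np.exp(logits - logits.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ v
```

With both masks enabled, the first token of each segment can attend only to itself, which gives a cheap sanity check when comparing against a fused kernel's output.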

I still can't figure out why flash attention always fails in the backward sharding in the CI. Flash attention and splash attention seem to have a conflict related to the in/out shard map.

@zpcore zpcore requested review from tengyifei and vanbasten23 April 15, 2025 22:38
Comment thread torch_xla/experimental/splash_attention.py Outdated
@zpcore zpcore requested a review from tengyifei April 16, 2025 16:58
Comment thread test/test_multi_queries_paged_attention_kernel.py
@zpcore zpcore enabled auto-merge (squash) April 16, 2025 17:28
@zpcore zpcore merged commit 13155c9 into master Apr 16, 2025
24 checks passed
@zpcore zpcore deleted the piz/skip_test branch April 16, 2025 19:42
@zpcore zpcore mentioned this pull request Apr 16, 2025