
Fix ragged paged attention v2 to resolve the recompilation issue#8797

Merged
yaochengji merged 3 commits into master from chengji/ragged_attn_fix on Mar 6, 2025
Conversation

@yaochengji (Collaborator)

No description provided.
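The PR provides no description, but the title points at XLA recompilation triggered by the ragged paged attention v2 kernel. A common cause is passing shape- or value-dependent quantities into the traced graph, so each new batch shape triggers a fresh compile; the usual mitigation is to bucket dynamic lengths to a small fixed set of sizes so XLA compiles once per bucket. The sketch below is purely illustrative of that general technique (the function name and bucket sizes are hypothetical, not taken from this PR):

```python
# Hypothetical sketch: round a dynamic sequence length up to a fixed
# bucket size, so the compiled kernel sees only a few static shapes
# instead of one shape (and one recompile) per request.
def pad_to_bucket(length, buckets=(128, 256, 512, 1024)):
    """Return the smallest bucket >= length, clamping to the largest bucket."""
    for b in buckets:
        if length <= b:
            return b
    return buckets[-1]
```

With bucketing, a request of length 100 and one of length 120 both run the 128-padded graph, avoiding a recompile for every distinct length.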

@yaochengji requested a review from @vanbasten23 on March 5, 2025 at 19:29
@yaochengji (Collaborator, Author)

@bythew3i could you take a look?

Comment threads: torch_xla/experimental/pallas_kernels/ragged_paged_attention_v2.py; torch_xla/experimental/custom_kernel.py (2 threads)
@bythew3i (Contributor) left a comment

Non-blocking comments.

Comment threads: torch_xla/experimental/custom_kernel.py (4 threads); torch_xla/experimental/pallas_kernels/ragged_paged_attention_v2.py
@yaochengji merged commit d06a9c9 into master on Mar 6, 2025

Labels: none yet

Projects: none yet

4 participants