
Support position_ids as input for flash attention #23

Merged
shadow150519 merged 6 commits into acc from dev/wtx on Dec 3, 2024
Conversation

@shadow150519

No description provided.
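Since the PR carries no description, here is a hedged sketch of the idea suggested by the file names (`flash_attention_varlen_position_ids_forward.cpp` etc.): in packed-sequence batches, `position_ids` restart at 0 at each sequence boundary, so they can be converted into the cumulative sequence lengths (`cu_seqlens`) that varlen flash attention kernels consume. The function name below is an illustration, not the PR's actual code:

```python
# Sketch (assumption, not the PR's implementation): derive cu_seqlens
# from a flat position_ids array of packed sequences. Every reset of
# position_ids to 0 marks the start of a new sequence.

def position_ids_to_cu_seqlens(position_ids):
    """Return cumulative sequence lengths from flat position_ids.

    Example: [0, 1, 2, 0, 1, 0, 1, 2, 3] describes three packed
    sequences of lengths 3, 2, and 4.
    """
    cu_seqlens = [0]
    for i, pos in enumerate(position_ids):
        if pos == 0 and i > 0:  # a reset to 0 starts a new sequence
            cu_seqlens.append(i)
    cu_seqlens.append(len(position_ids))
    return cu_seqlens
```

With this mapping, a caller can pass `position_ids` directly instead of precomputing `cu_seqlens`, which is convenient when sequences are packed on the fly.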

Review comment threads:
- torch_xla/csrc/flash_attention_utils.cpp (5 threads, 4 marked outdated)
- torch_xla/csrc/ops/flash_attention_varlen_position_ids_forward.cpp (3 threads, 2 marked outdated)
- torch_xla/csrc/ops/flash_attention_varlen_forward.cpp (1 thread, outdated)
- torch_xla/csrc/ops/flash_attention_varlen_position_ids_backward.cpp (1 thread, outdated)
@Seventeen17

👍

@shadow150519 shadow150519 merged commit 5598464 into acc Dec 3, 2024

2 participants