
[ONNX] Fix scaled_dot_product_attention with float scale#135594

Closed
titaiwangms wants to merge 1 commit into pytorch:main from titaiwangms:titaiwang/fix_scaled_dot

Conversation

@titaiwangms
Collaborator

Fixes #125158
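
For reference, the linked issue can be exercised with an export along these lines (a minimal sketch; the shapes, the 0.125 scale, and the module name SDPA are arbitrary):

import torch
import torch.nn.functional as F


class SDPA(torch.nn.Module):
    def forward(self, q, k, v):
        # An explicit float scale is the case the exporter previously mishandled.
        return F.scaled_dot_product_attention(q, k, v, scale=0.125)


q, k, v = (torch.randn(2, 4, 8, 16) for _ in range(3))
torch.onnx.export(SDPA(), (q, k, v), "sdpa.onnx", opset_version=14)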

@pytorch-bot

pytorch-bot bot commented Sep 10, 2024

🔗 Helpful Links

🧪 See artifacts and rendered test results at hud.pytorch.org/pr/135594

Note: Links to docs will display an error until the docs builds have been completed.

✅ You can merge normally! (2 Unrelated Failures)

As of commit f982235 with merge base dfb2b66:

FLAKY - The following job failed but was likely due to flakiness present on trunk:

BROKEN TRUNK - The following job failed but was present on the merge base:

👉 Rebase onto the `viable/strict` branch to avoid these failures

This comment was automatically generated by Dr. CI and updates every 15 minutes.

@pytorch-bot pytorch-bot bot added the release notes: onnx label Sep 10, 2024
@titaiwangms titaiwangms added the topic: bug fixes, module: onnx, and release notes: onnx labels and removed the release notes: onnx label Sep 10, 2024
@titaiwangms titaiwangms added this to the 2.5.0 milestone Sep 10, 2024
), "is_causal and attn_mask cannot be set at the same time"
assert not enable_gqa, "conversion of scaled_dot_product_attention not implemented if enable_gqa is True"

scale = symbolic_helper._maybe_get_const(scale, "f")
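
To sketch the idea behind the fix (an illustration only, not the merged diff: the helper name `_fold_float_scale` is hypothetical, while `symbolic_helper._maybe_get_const` and `_type_utils.JitScalarType` are existing private torch.onnx utilities), a constant float scale has to flow back into the graph as a Constant node in the query's dtype, so that fp16 inputs are not multiplied by a float32 scalar:

import torch
from torch.onnx import _type_utils, symbolic_helper


def _fold_float_scale(g, query, scale):
    # Hypothetical helper sketching the approach; not the actual PR diff.
    if symbolic_helper._is_none(scale):
        return scale  # no user-supplied scale; the default 1/sqrt(E) path applies
    scale_f = symbolic_helper._maybe_get_const(scale, "f")
    if not isinstance(scale_f, float):
        return scale  # dynamic (non-constant) scale: pass the graph value through
    # Constant float: materialize it with query's dtype so fp16 inputs do not
    # hit a Mul against a float32 scalar.
    q_dtype = _type_utils.JitScalarType.from_value(query).dtype()
    return g.op("Constant", value_t=torch.tensor(scale_f, dtype=q_dtype))

The merged change may differ in detail; the point is that the scalar parsed by `_maybe_get_const` must re-enter the graph with a dtype matching the attention inputs.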
Collaborator
Just making sure: does it work for fp16?

Contributor
The fix seems to work with my FP16 case; I see no warnings either.
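
For anyone wanting to double-check the fp16 path locally, something along these lines should work (a sketch; it assumes fp16 scaled_dot_product_attention runs on the device used for tracing, and uses the onnx package only to peek at the emitted constants):

import onnx
import torch
import torch.nn.functional as F


class SDPA(torch.nn.Module):
    def forward(self, q, k, v):
        return F.scaled_dot_product_attention(q, k, v, scale=0.125)


# Move the inputs (and module) to CUDA first if fp16 SDPA is unavailable on CPU.
inputs = tuple(torch.randn(2, 4, 8, 16, dtype=torch.float16) for _ in range(3))
torch.onnx.export(SDPA(), inputs, "sdpa_fp16.onnx", opset_version=14)

# Spot-check the constants the exporter emitted: the scale constant should come
# out as FLOAT16 rather than FLOAT, so the Mul dtypes agree.
model = onnx.load("sdpa_fp16.onnx")
for node in model.graph.node:
    if node.op_type == "Constant":
        for attr in node.attribute:
            if attr.name == "value":
                print(node.output[0], attr.t.data_type == onnx.TensorProto.FLOAT16)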

@titaiwangms
Collaborator Author

@pytorchbot merge

@pytorch-bot pytorch-bot bot added the ciflow/trunk label Sep 10, 2024
@pytorchmergebot
Collaborator

Merge started

Your change will be merged once all checks pass (ETA 0-4 Hours).


@justinchuby
Collaborator

@titaiwangms did you want to cherry-pick this as well? It can go under category (2) or (3), I think.

@titaiwangms
Collaborator Author

@titaiwangms did you want to cherry-pick this as well? It can go under category (2) or (3), I think.

Sure! Looks like people need this.

titaiwangms added a commit to titaiwangms/pytorch that referenced this pull request Sep 11, 2024
malfet pushed a commit that referenced this pull request Sep 12, 2024
Chao1Han pushed a commit to Chao1Han/pytorch that referenced this pull request Sep 20, 2024

Labels

ciflow/trunk, Merged, module: onnx, open source, release notes: onnx, topic: bug fixes

Projects

None yet

Development

Successfully merging this pull request may close these issues.

scale parsed as float in ONNX scaled_dot_product_attention implementation

5 participants