
Fix no source name in backward kernel names; Add flex_attention HOP to "original_aten" node meta #167749

Closed

yushangdi wants to merge 1 commit into main from sy_kernel_bw_name

Conversation

@yushangdi
Contributor

@yushangdi yushangdi commented Nov 13, 2025

Fixes #167706

  • Add torch.fx.experimental.proxy_tensor.set_original_aten_op() around the flex_attention HOP dispatch so that original_aten is populated for flex_attention
  • Update the usages of original_aten to also accept a HOP in addition to an OpOverload

cc @voznesenskym @penguinwu @EikanWang @jgong5 @Guobing-Chen @XiaobingSuper @zhuhaozhe @blzheng @wenzhe-nrv @jiayisunx @ipiszy @chenyang78 @kadeng @muchulee8 @amjames @chauhang @aakhundov @coconutruben @Lucaskabela
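To illustrate the mechanism the first bullet relies on, here is a minimal, self-contained sketch of how a `set_original_aten_op`-style context manager can stamp the active op into node metadata during tracing. The names `_current_original_op` and `make_node` are illustrative stand-ins, not the real torch internals.

```python
import contextlib

# Hypothetical stand-in for the tracer's notion of the "current original op":
# tracing reads this when it creates node metadata.
_current_original_op = None

@contextlib.contextmanager
def set_original_aten_op(op):
    """Record `op` as the originating op for nodes created inside the block."""
    global _current_original_op
    prev = _current_original_op
    _current_original_op = op
    try:
        yield
    finally:
        _current_original_op = prev

def make_node(name):
    """Mimic proxy tracing: stamp the active op into the node's meta dict."""
    meta = {}
    if _current_original_op is not None:
        meta["original_aten"] = _current_original_op
    return {"name": name, "meta": meta}

# Dispatching a HOP such as flex_attention under the context manager
# leaves "original_aten" populated on every node created inside it.
with set_original_aten_op("flex_attention"):
    node = make_node("getitem")
assert node["meta"]["original_aten"] == "flex_attention"
```

The second bullet follows from this: once a HOP (rather than an OpOverload) can be the recorded value, every consumer of `original_aten` has to tolerate both types.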

@pytorch-bot
Copy link

pytorch-bot bot commented Nov 13, 2025

🔗 Helpful Links

🧪 See artifacts and rendered test results at hud.pytorch.org/pr/167749

Note: Links to docs will display an error until the docs builds have been completed.

✅ You can merge normally! (4 Unrelated Failures)

As of commit 4e35032 with merge base a76dd6b:

FLAKY - The following jobs failed but were likely due to flakiness present on trunk:

UNSTABLE - The following job is marked as unstable, possibly due to flakiness on trunk:

This comment was automatically generated by Dr. CI and updates every 15 minutes.

@pytorch-bot
Copy link

pytorch-bot bot commented Nov 13, 2025

The label module: inductor is only applicable to issues and has been removed. Please only use this label on issues.

@pytorch-bot
Copy link

pytorch-bot bot commented Nov 13, 2025

The label module: dynamo is only applicable to issues and has been removed. Please only use this label on issues.

@yushangdi changed the title to "Fix no source name in backward kernel names; Add flex_attention HOP to "original_aten"" Nov 13, 2025
@yushangdi changed the title to "Fix no source name in backward kernel names; Add flex_attention HOP to "original_aten" node meta" Nov 13, 2025

@drisspg
Contributor

drisspg commented Nov 13, 2025

what do we see now? also does this apply to the generated kernels?

@yushangdi
Contributor Author

what do we see now? also does this apply to the generated kernels?

example output: triton_tem_fused_flex_attention_backward_zeros_1

V1113 11:32:05.735000 42282 /data/users/shangdiy/pytorch/torch/_inductor/graph.py:2431] [1/0] [__output_code]             # Topologically Sorted Source Nodes: [full_default_4, flex_attention_backward], Original ATen: [aten.zeros, flex_attention_backward]
V1113 11:32:05.735000 42282 /data/users/shangdiy/pytorch/torch/_inductor/graph.py:2431] [1/0] [__output_code]             stream0 = get_raw_stream(0)
V1113 11:32:05.735000 42282 /data/users/shangdiy/pytorch/torch/_inductor/graph.py:2431] [1/0] [__output_code]             triton_tem_fused_flex_attention_backward_zeros_1.run(primals_2, primals_5, primals_6, getitem_3, buf1, tangents_1, buf3, buf4, full, full_default, convert_element_type, convert_element_type_1, buf5, buf6, buf7, buf8, buf9, s50, s41, s0, ((15 + s0) // 16) + ((15 + s50) // 16), 1, 2, stream=stream0)
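The log above shows the fix in effect: the backward kernel is now named `triton_tem_fused_flex_attention_backward_zeros_1`, carrying its source ops instead of an anonymous name. As a rough sketch of how such a name can be derived from the `original_aten` entries of the fused nodes (a simplified illustration, not Inductor's actual naming code):

```python
import re

def fused_kernel_name(origins, index):
    """Build an Inductor-style fused kernel name from the original_aten
    entries of the nodes fused into one kernel. The 'triton_tem_fused_'
    prefix and the sorting are illustrative assumptions matching the
    log output above, not the real implementation."""
    # Strip the "aten." namespace, dedupe, and sort so the suffix is
    # stable regardless of node order.
    parts = sorted({re.sub(r"^aten\.", "", o) for o in origins})
    return f"triton_tem_fused_{'_'.join(parts)}_{index}"

# The two origins from the log line "Original ATen: [aten.zeros,
# flex_attention_backward]" yield the kernel name seen above.
name = fused_kernel_name(["aten.zeros", "flex_attention_backward"], 1)
# → "triton_tem_fused_flex_attention_backward_zeros_1"
```

Before the fix, the flex_attention HOP had no `original_aten` entry, so the backward kernel name carried only the `zeros` source.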

@yushangdi yushangdi requested a review from ezyang November 13, 2025 20:29

@yushangdi
Contributor Author

@pytorchbot merge

@pytorchmergebot
Collaborator

Merge started

Your change will be merged once all checks pass (ETA 0-4 Hours).

Learn more about merging in the wiki.

Questions? Feedback? Please reach out to the PyTorch DevX Team

Advanced Debugging
Check the merge workflow status here.

Silv3S pushed a commit to Silv3S/pytorch that referenced this pull request Nov 18, 2025
Fix no source name in backward kernel names; Add flex_attention HOP to "original_aten" node meta (pytorch#167749)

Fixes pytorch#167706

- Add `torch.fx.experimental.proxy_tensor.set_original_aten_op()` around the flex_attention HOP dispatch so that `original_aten` is populated for flex_attention
- Update the usages of `original_aten` to also accept a HOP in addition to an OpOverload

Pull Request resolved: pytorch#167749
Approved by: https://github.com/drisspg
@github-actions github-actions bot deleted the sy_kernel_bw_name branch December 15, 2025 02:21

Labels

ciflow/inductor, ciflow/trunk, Merged, release notes: fx

Projects

None yet

Development

Successfully merging this pull request may close these issues.

Improve Inductor generated kernel names: important pieces like flex attention backward not included

3 participants