
[inductor] estimate peak memory in codegen only when buffer reuse#162300

Closed
ruisizhang123 wants to merge 1 commit into main from ruisi/relax_memory

Conversation

@ruisizhang123
Contributor

@ruisizhang123 ruisizhang123 commented Sep 5, 2025

As titled, this PR ensures peak memory is estimated only when buffer reuse is enabled. Without this config, some nodes' successor nodes are eliminated from memory estimation after inductor bucketing, which can cause errors.

The original codegen peak memory estimation code is from this PR: #159530
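The gating described above can be sketched in a few lines. This is a toy illustration, not the actual inductor code: the flag name `allow_buffer_reuse` and the estimator below are stand-ins assumed for the example.

```python
# Sketch of gating codegen-time peak-memory estimation on buffer reuse.
# Both the flag and the estimator are hypothetical stand-ins for the
# real torch._inductor internals.

def maybe_estimate_peak_memory(nodes, allow_buffer_reuse: bool):
    """Estimate peak memory only when buffer reuse is enabled.

    When reuse is disabled (e.g. after inductor bucketing eliminates
    some nodes' successors), the estimator's successor bookkeeping can
    be inconsistent, so estimation is skipped entirely.
    """
    if not allow_buffer_reuse:
        return None  # skip estimation; successor info may be incomplete
    return estimate_peak_memory(nodes)


def estimate_peak_memory(nodes):
    """Toy estimator: peak of a running sum of (alloc, free) pairs."""
    live = peak = 0
    for alloc, free in nodes:
        live += alloc
        peak = max(peak, live)
        live -= free
    return peak
```

With `allow_buffer_reuse=False` the function returns `None` and no estimation runs, which mirrors the conditional this PR adds around the codegen estimation path.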

cc @voznesenskym @penguinwu @EikanWang @jgong5 @Guobing-Chen @XiaobingSuper @zhuhaozhe @blzheng @wenzhe-nrv @jiayisunx @ipiszy @chenyang78 @kadeng @muchulee8 @amjames @chauhang @aakhundov @coconutruben

@pytorch-bot

pytorch-bot bot commented Sep 5, 2025

🔗 Helpful Links

🧪 See artifacts and rendered test results at hud.pytorch.org/pr/162300

Note: Links to docs will display an error until the docs builds have been completed.

✅ No Failures

As of commit 76c5e85 with merge base c321111:
💚 Looks good so far! There are no failures yet. 💚

This comment was automatically generated by Dr. CI and updates every 15 minutes.

@ruisizhang123 ruisizhang123 changed the title from "[inductor] relax memory estimation node planning" to "[inductor] estimate peak memory in codegen only when buffer reuse" Sep 5, 2025
@ruisizhang123
Contributor Author

@pytorchbot merge

@pytorch-bot pytorch-bot bot added the ciflow/trunk Trigger trunk jobs on your pull request label Sep 5, 2025
@pytorchmergebot
Collaborator

Merge failed

Reason: This PR needs a release notes: label.
If your changes are user facing and intended to be part of the release notes, please use a label starting with release notes:.

If not, please add the topic: not user facing label.

To add a label, you can comment to pytorchbot, for example
@pytorchbot label "topic: not user facing"

For more information, see
https://github.com/pytorch/pytorch/wiki/PyTorch-AutoLabel-Bot#why-categorize-for-release-notes-and-how-does-it-work.

Details for Dev Infra team Raised by workflow job

@ruisizhang123
Contributor Author

@pytorchbot label "topic: not user facing"

@pytorch-bot pytorch-bot bot added the topic: not user facing topic category label Sep 5, 2025
@ruisizhang123
Contributor Author

@pytorchbot merge

@pytorchmergebot
Collaborator

Merge started

Your change will be merged once all checks pass (ETA 0-4 Hours).

Learn more about merging in the wiki.

Questions? Feedback? Please reach out to the PyTorch DevX Team

Advanced Debugging: check the merge workflow status here.

daisyden pushed a commit to daisyden/pytorch that referenced this pull request Sep 8, 2025
[inductor] estimate peak memory in codegen only when buffer reuse (pytorch#162300)

As titled, this PR ensures peak memory is estimated only when buffer reuse is enabled. Without this config, some nodes' successor nodes are eliminated from memory estimation after inductor bucketing, which can cause errors.

The original codegen peak memory estimation code is from this PR: pytorch#159530

Pull Request resolved: pytorch#162300
Approved by: https://github.com/eellison, https://github.com/v0i0
markc-614 pushed a commit to markc-614/pytorch that referenced this pull request Sep 17, 2025
mansiag05 pushed a commit to mansiag05/pytorch that referenced this pull request Sep 22, 2025
cleonard530 pushed a commit to cleonard530/pytorch that referenced this pull request Sep 22, 2025
dsashidh pushed a commit to dsashidh/pytorch that referenced this pull request Sep 26, 2025
@github-actions github-actions bot deleted the ruisi/relax_memory branch October 6, 2025 02:11