[inductor] estimate peak memory in codegen only when buffer reuse #162300
ruisizhang123 wants to merge 1 commit into main
Conversation
🔗 Helpful Links
🧪 See artifacts and rendered test results at hud.pytorch.org/pr/162300
Note: Links to docs will display an error until the docs builds have been completed.
✅ No Failures as of commit 76c5e85 with merge base c321111.
This comment was automatically generated by Dr. CI and updates every 15 minutes.
Force-pushed from 4567595 to 2abf892 (Compare)
Force-pushed from 2abf892 to 76c5e85 (Compare)
@pytorchbot merge
Merge failed
Reason: This PR needs a release notes label. If your changes are user facing, please use a label starting with `release notes:`. If not, please add the `topic: not user facing` label.
To add a label, you can comment to pytorchbot, for example: `@pytorchbot label "topic: not user facing"`.
For more information, see the PyTorch bot documentation.
Details for Dev Infra team: raised by workflow job.
@pytorchbot label "topic: not user facing"
@pytorchbot merge
Merge started
Your change will be merged once all checks pass (ETA 0-4 hours). Learn more about merging in the wiki. Questions? Feedback? Please reach out to the PyTorch DevX Team.
[inductor] estimate peak memory in codegen only when buffer reuse (pytorch#162300)
As titled, this PR ensures peak memory is estimated only when buffer reuse is enabled. Without this gating, some nodes' successor nodes are eliminated from memory estimation after inductor bucketing, which can cause errors.
The original codegen peak memory estimation code is from this PR: pytorch#159530
Pull Request resolved: pytorch#162300
Approved by: https://github.com/eellison, https://github.com/v0i0
As titled, this PR ensures peak memory is estimated only when buffer reuse is enabled. Without this gating, some nodes' successor nodes are eliminated from memory estimation after inductor bucketing, which can cause errors.
The original codegen peak memory estimation code is from this PR: #159530
cc @voznesenskym @penguinwu @EikanWang @jgong5 @Guobing-Chen @XiaobingSuper @zhuhaozhe @blzheng @wenzhe-nrv @jiayisunx @ipiszy @chenyang78 @kadeng @muchulee8 @amjames @chauhang @aakhundov @coconutruben
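The gating described above can be sketched as follows. This is a hypothetical, simplified illustration, not the actual Inductor code: `InductorConfig`, `estimate_peak_memory`, and `codegen` are stand-in names, and the estimator is a toy running-liveness sum. The point is the shape of the fix: the codegen-time peak-memory estimate only runs when the buffer-reuse config is on, so a scheduling graph whose successor edges were pruned (e.g. after bucketing with reuse disabled) is never fed into the estimator.

```python
# Hypothetical sketch of gating peak-memory estimation behind a
# buffer-reuse config flag (names are illustrative, not PyTorch's API).
from dataclasses import dataclass
from typing import Optional

@dataclass
class InductorConfig:
    # Assumed config flag controlling whether buffers may be reused.
    allow_buffer_reuse: bool = True

def estimate_peak_memory(node_sizes: list[tuple[int, int]]) -> int:
    """Toy estimator: each node allocates `alloc` bytes and frees `free`
    bytes after it runs; track the maximum live total."""
    live = 0
    peak = 0
    for alloc, free in node_sizes:
        live += alloc
        peak = max(peak, live)
        live -= free
    return peak

def codegen(nodes: list[tuple[int, int]], config: InductorConfig) -> Optional[int]:
    """Run codegen-time estimation only when buffer reuse is enabled,
    mirroring the gating this PR adds (conceptually)."""
    if config.allow_buffer_reuse:
        return estimate_peak_memory(nodes)
    # With reuse disabled, skip estimation entirely rather than risk
    # estimating over a graph with eliminated successor nodes.
    return None
```

For example, `codegen([(4, 0), (4, 4)], InductorConfig())` reports a peak of 8, while the same call with `allow_buffer_reuse=False` returns `None` instead of attempting an estimate.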