[inductor] Small refactor of CachingAutotuner #162406
kundaMwiza wants to merge 3 commits into pytorch:main from …
Conversation
🔗 Helpful Links
🧪 See artifacts and rendered test results at hud.pytorch.org/pr/162406
Note: Links to docs will display an error until the docs builds have been completed.
✅ No Failures
As of commit 7835b6c with merge base 4cf2900.
This comment was automatically generated by Dr. CI and updates every 15 minutes.
@pytorchbot label "topic: not user facing"
@exclamaforte @eellison Can I please get a review of this small PR when you're free? Thanks!
Force-pushed from a6563ff to cf4b1e2
Force-pushed from cf4b1e2 to 232fb00
@exclamaforte @eellison bump
@pytorchbot merge
Merge started. Your change will be merged once all checks pass (ETA 0-4 hours). Learn more about merging in the wiki. Questions? Feedback? Please reach out to the PyTorch DevX Team.
The merge job was canceled or timed out. This most often happens if two merge requests were issued for the same PR, or if the merge job was waiting for more than 6 hours for tests to finish. In the latter case, please do not hesitate to reissue the merge command.
@pytorchbot merge
Merge started. Your change will be merged once all checks pass (ETA 0-4 hours). Learn more about merging in the wiki. Questions? Feedback? Please reach out to the PyTorch DevX Team.
This is a simple refactor that just moves some logic in _precompile_config into two new functions, for separation of concerns. This will allow subclasses (e.g. out of tree) to configure the options and metadata passed to triton.compile; see the sketch below.

cc @voznesenskym @penguinwu @EikanWang @jgong5 @Guobing-Chen @XiaobingSuper @zhuhaozhe @blzheng @wenzhe-nrv @jiayisunx @ipiszy @chenyang78 @kadeng @muchulee8 @amjames @chauhang @aakhundov @coconutruben
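For illustration, here is a minimal, self-contained sketch of the separation-of-concerns pattern this refactor enables. Everything in it is an assumption made for the sketch: the hook names make_compile_options / make_compile_metadata, the Config fields, and the fake_triton_compile stub are illustrative stand-ins, not the actual helpers or signatures introduced by this PR.

```python
# Hedged sketch only: names and signatures below are hypothetical, not the
# real CachingAutotuner internals or the triton.compile API.
from dataclasses import dataclass, field


@dataclass
class Config:
    # Illustrative config fields; the real inductor Config differs.
    num_warps: int = 4
    num_stages: int = 2
    kwargs: dict = field(default_factory=dict)


def fake_triton_compile(src, options, metadata):
    # Stand-in for the real compile call so the sketch runs on its own.
    return f"compiled {src!r} with {options} / {metadata}"


class CachingAutotuner:
    def __init__(self, src):
        self.src = src

    def make_compile_options(self, cfg: Config) -> dict:
        # Hook 1 (hypothetical): build the options dict for the compiler.
        return {"num_warps": cfg.num_warps, "num_stages": cfg.num_stages}

    def make_compile_metadata(self, cfg: Config) -> dict:
        # Hook 2 (hypothetical): build the kernel metadata for the compiler.
        return {"constants": dict(cfg.kwargs)}

    def precompile_config(self, cfg: Config):
        # Orchestration stays here; the policy lives in the two hooks above.
        return fake_triton_compile(
            self.src,
            options=self.make_compile_options(cfg),
            metadata=self.make_compile_metadata(cfg),
        )


class OutOfTreeAutotuner(CachingAutotuner):
    # An out-of-tree subclass overrides only the hook it needs, without
    # copying the rest of the precompile logic.
    def make_compile_options(self, cfg: Config) -> dict:
        options = super().make_compile_options(cfg)
        options["backend_flag"] = True  # hypothetical backend-specific option
        return options


if __name__ == "__main__":
    tuner = OutOfTreeAutotuner(src="kernel_src")
    print(tuner.precompile_config(Config(kwargs={"BLOCK": 128})))
```

The point of the split: by pulling the option-building and metadata-building steps out of _precompile_config, a subclass can customize what gets passed to triton.compile while reusing the unchanged compilation orchestration.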