Logging Refactor - Remove Print Statements #139782
basilwong wants to merge 1 commit into pytorch:main
Conversation
🔗 Helpful Links 🧪 See artifacts and rendered test results at hud.pytorch.org/pr/139782
Note: Links to docs will display an error until the docs builds have been completed. ✅ No Failures as of commit 91b30fe with merge base 3e82b1f. (This comment was automatically generated by Dr. CI and updates every 15 minutes.)

This pull request was exported from Phabricator. Differential Revision: D61678215
torch/_functorch/partitioners.py (outdated)

```
        )
        sorted_sizes = sorted([(_size_of(i), str(i)) for i in saved_values])
        logger.debug(
            f"Theoretical Per Activation Storage Sizes: {sorted_sizes}"
```
Ruff will complain about this in a second
Suggested change:
```
-            f"Theoretical Per Activation Storage Sizes: {sorted_sizes}"
+            "Theoretical Per Activation Storage Sizes: %s", sorted_sizes
```
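The suggested change above switches from an f-string to %-style lazy formatting. A minimal sketch of why that matters (the `Expensive` class and logger name are hypothetical, used only to make the formatting cost observable): with %-style arguments, `logging` defers string formatting until a handler actually emits the record, so a disabled `debug` call costs almost nothing, while an f-string is always built eagerly at the call site. This is also the reason ruff's flake8-logging-format rules flag f-strings in logging calls.

```python
import logging

logging.basicConfig(level=logging.INFO)  # DEBUG records are filtered out
logger = logging.getLogger("partitioner_demo")  # hypothetical logger name

class Expensive:
    """Counts how many times it is actually stringified."""
    def __init__(self):
        self.calls = 0
    def __str__(self):
        self.calls += 1
        return "expensive"

obj = Expensive()

# f-string: formatted eagerly at the call site, even though DEBUG is disabled
logger.debug(f"sizes: {obj}")
eager_calls = obj.calls   # 1: __str__ ran despite the record being dropped

# %-style: logging checks the level first and never formats the message
logger.debug("sizes: %s", obj)
lazy_calls = obj.calls    # still 1: __str__ was never invoked

print(eager_calls, lazy_calls)
```

Note the same pattern applies to `logger.info`/`logger.warning`: pass the values as extra arguments, never pre-format them.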
torch/_functorch/partitioners.py (outdated)

```
        )
-        print(
+        logger.debug(
            "Count of Ops Rematerialized: ",
```
Suggested change:
```
-            "Count of Ops Rematerialized: ",
+            "Count of Ops Rematerialized: %s",
```
Check the ruff errors in lintrunner; it will flag a lot of this logging for you.
torch/_functorch/partitioners.py (outdated)

```
        if node.op == "call_function":
            cnt[node.target.__name__] += 1
-        print(sorted(cnt.items(), key=lambda x: x[1], reverse=True))
+        logger.info(sorted(cnt.items(), key=lambda x: x[1], reverse=True))
```
Suggested change:
```
-        logger.info(sorted(cnt.items(), key=lambda x: x[1], reverse=True))
+        logger.info("%s", sorted(cnt.items(), key=lambda x: x[1], reverse=True))
```
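For context, the code under review tallies `call_function` node targets and logs them sorted by frequency. A self-contained sketch of that pattern, with hypothetical op names standing in for `node.target.__name__` values from the FX graph:

```python
import logging
from collections import Counter

logging.basicConfig(level=logging.INFO)
logger = logging.getLogger("partitioner_demo")  # hypothetical logger name

# Hypothetical op names; in the real code these come from node.target.__name__
ops = ["add", "mul", "add", "relu", "add", "mul"]
cnt = Counter(ops)

# Sort (name, count) pairs by count, most frequent first, and log lazily
ranked = sorted(cnt.items(), key=lambda x: x[1], reverse=True)
logger.info("Count of Ops Rematerialized: %s", ranked)
print(ranked)  # [('add', 3), ('mul', 2), ('relu', 1)]
```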
Force-pushed 74a1275 to 71de1ef
Force-pushed 71de1ef to 35c848c
Updated the diff based on comments. Testing also confirms the output.
Force-pushed 35c848c to ef52d8d
@pytorchbot label "topic: not user facing"
Force-pushed ef52d8d to 56053d1
Force-pushed 56053d1 to ef57d09
/easycla
@pytorchbot rebase

@pytorchbot started a rebase job onto refs/remotes/origin/viable/strict. Check the current status here.

Successfully rebased
Force-pushed 8fc228a to 93d720c
Force-pushed 93d720c to 16d0390
Summary: Removes print statements and implements logging via the logging library. Hopefully this will allow more control over the level of logging when running models.

Test Plan:
```
AOT_PARTITIONER_DEBUG=1 buck2 run mode/opt //aps_models/ads/icvr:icvr_launcher -- mode=local_fb_fm_v4 launcher.num_workers=2
```
* Resulting output paste: P1674535630
* Full logs paste: P1674535621
```
pastry P1674535621 | grep "functorch/partitioners.py" | pastry
```
* Logging results: P1674549514

Reviewed By: paryxyt
Differential Revision: D61678215
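The point of this refactor is that `logging` calls can be silenced or enabled per module, which `print` cannot. A generic sketch of that control knob (in PyTorch the actual switch involves `AOT_PARTITIONER_DEBUG` and the torch logging system; here we only toggle a stdlib logger level, and the handler setup is illustrative):

```python
import logging

# The PR routes output through the module logger for torch/_functorch/partitioners.py
logger = logging.getLogger("torch._functorch.partitioners")
logger.addHandler(logging.StreamHandler())  # illustrative handler

# With the level at INFO, the new logger.debug calls are suppressed
logger.setLevel(logging.INFO)
print(logger.isEnabledFor(logging.DEBUG))  # False

# Raising verbosity to DEBUG turns the per-activation size dumps back on
logger.setLevel(logging.DEBUG)
print(logger.isEnabledFor(logging.DEBUG))  # True
```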
Force-pushed 16d0390 to 23fafe2
Force-pushed 23fafe2 to 04750b2
Force-pushed 04750b2 to 91b30fe
@pytorchbot merge (Initiating merge automatically since Phabricator Diff has merged)

Merge started. Your change will be merged once all checks pass (ETA 0-4 hours). Learn more about merging in the wiki. Questions? Feedback? Please reach out to the PyTorch DevX Team.
Merged commit message:

Summary: Removes print statements and implements logging via the logging library. Hopefully this will allow more control over the level of logging when running models.

Test Plan:
```
AOT_PARTITIONER_DEBUG=1 buck2 run @mode/opt //aps_models/ads/icvr:icvr_launcher -- mode=local_fb_fm_v4 launcher.num_workers=2
```
* Resulting output paste: P1674535630
* Full logs paste: P1674535621
```
pastry P1674535621 | grep "functorch/partitioners.py" | pastry
```
* Logging results: P1674549514

Differential Revision: D61678215
Pull Request resolved: pytorch#139782
Approved by: https://github.com/paryxyt, https://github.com/jansel