Implement public API InferenceMode and its error handling #55008
ailzhang wants to merge 1 commit into pytorch:master
Conversation
💊 CI failures summary and remediations — as of commit e2a7069 (more details on the Dr. CI page):

🕵️ 1 new failure recognized by patterns. The following CI failures do not appear to be due to upstream breakages:

| Job | Step | Action |
|---|---|---|
|  | Ensure correct trailing newlines | 🔁 rerun |
This comment was automatically generated by Dr. CI.
This pull request was exported from Phabricator. Differential Revision: D27443229
Force-pushed from e89b80a to 091e0d3.
Force-pushed from 091e0d3 to eff01c3.
Force-pushed from eff01c3 to f02f223.
Changes compared to the version before the reland: https://gist.github.com/ailzhang/86abc3b72da4b2ec598a983fa52b7476
bhosmer left a comment:

Thanks for the diff from the earlier version! 😁 This LGTM, but I'm interested in your take on (a) framing the scheme a little more generally and (b) encapsulating the xor so call sites don't need to remember to do it (per inline comments).
Force-pushed from f02f223 to 9b9e792.
Force-pushed from 9b9e792 to 670a669.
Summary: Pull Request resolved: pytorch#55008 (reland of pytorch#53343). For easier review, here is a diff against the version before the revert: https://www.internalfb.com/phabricator/paste/view/P361764610

Test Plan: Imported from OSS

Differential Revision: D27443229

Pulled By: ailzhang

fbshipit-source-id: faeaff3b6165b933c9f354d5f0344e38269fbb12
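In C++ the feature landed here is used as an RAII scope guard (`c10::InferenceMode guard;`). As a rough, self-contained illustration of the semantics — a thread-local mode flag that is set for the lifetime of a scope and restored on exit — here is a Python sketch; the names `inference_mode` and `is_inference_mode_enabled` are invented for this example and are not the PyTorch API:

```python
import threading
from contextlib import contextmanager

# Thread-local storage for the mode flag, mirroring the TLS used in c10.
_tls = threading.local()

def is_inference_mode_enabled() -> bool:
    """Cheap read of the thread-local flag (hypothetical helper)."""
    return getattr(_tls, "inference_mode", False)

@contextmanager
def inference_mode(enabled: bool = True):
    """Scope guard: set the flag on entry, restore the prior value on exit."""
    prev = is_inference_mode_enabled()
    _tls.inference_mode = enabled
    try:
        yield
    finally:
        _tls.inference_mode = prev
```

Because the previous value is saved and restored, guards nest correctly, which is the property the RAII design gives the C++ API for free.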
Force-pushed from 670a669 to e2a7069.
@ailzhang has imported this pull request. If you are a Facebook employee, you can view this diff on Phabricator.
I tried to avoid creating a new TLS entry, but InferenceMode::is_enabled() is on a perf-critical path (the TensorImpl constructor), so it seems worth adding one for it. This PR reduces one source of the instruction-count increase introduced by #55008.

```
λ ~ python compare.py
<torch.utils.benchmark.utils.valgrind_wrapper.timer_interface.FunctionCounts object at 0x7f59097ef310>
   100  0x0000000004854750
  -100  0x0000000004854760
 -4400  c10::impl::tls_is_dispatch_key_included(...)
```

[ghstack-poisoned]
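The point of the dedicated TLS entry is that `is_enabled()` becomes a direct boolean read instead of a membership test against the thread-local dispatch key set (the `c10::impl::tls_is_dispatch_key_included` call whose count drops above). A hedged, self-contained Python analogue of the two access patterns — every name here is invented for illustration, not the c10 API:

```python
import threading

_tls = threading.local()

def tls_is_key_included(key: str) -> bool:
    # Slower path: membership test against a thread-local key set,
    # analogous in spirit to c10::impl::tls_is_dispatch_key_included.
    return key in getattr(_tls, "included_keys", frozenset())

def inference_mode_enabled_via_keyset() -> bool:
    return tls_is_key_included("InferenceMode")

def inference_mode_enabled_via_flag() -> bool:
    # Faster path: a dedicated thread-local boolean, read directly.
    return getattr(_tls, "inference_mode", False)

def set_inference_mode(on: bool) -> None:
    # Keep both representations in sync when toggling the mode.
    keys = set(getattr(_tls, "included_keys", frozenset()))
    (keys.add if on else keys.discard)("InferenceMode")
    _tls.included_keys = frozenset(keys)
    _tls.inference_mode = on
```

Both paths return the same answer; the dedicated flag simply avoids the set lookup on every TensorImpl construction.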
Summary: Pull Request resolved: #55424. Pull Request resolved: #55238. I tried to avoid creating a new TLS entry, but InferenceMode::is_enabled() is on a perf-critical path (the TensorImpl constructor), so it seems worth adding one for it. This PR reduces one source of the instruction-count increase introduced by #55008.

```
λ ~ python compare.py
<torch.utils.benchmark.utils.valgrind_wrapper.timer_interface.FunctionCounts object at 0x7f59097ef310>
   100  0x0000000004854750
  -100  0x0000000004854760
 -4400  c10::impl::tls_is_dispatch_key_included(...)
```

Test Plan: Imported from OSS

Reviewed By: ezyang

Differential Revision: D27539230

Pulled By: ailzhang

fbshipit-source-id: e040877faef966dca3c2c3d5f9e9a80496c81415
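The numbers in the test plan come from diffing two valgrind instruction-count profiles (the `FunctionCounts` object printed by `compare.py`); a negative entry means the new build executes fewer instructions in that function. A self-contained sketch of that kind of per-function delta, with plain dicts standing in for `FunctionCounts` and `diff_counts` as an invented helper name:

```python
def diff_counts(before: dict, after: dict) -> dict:
    """Per-function instruction-count delta; negative means fewer after."""
    deltas = {fn: after.get(fn, 0) - before.get(fn, 0)
              for fn in set(before) | set(after)}
    # Keep only functions whose count actually changed, largest change first.
    return dict(sorted(((f, d) for f, d in deltas.items() if d),
                       key=lambda kv: -abs(kv[1])))
```

Under this reading, the `-4400` row above says the profiled workload spends 4400 fewer instructions in `tls_is_dispatch_key_included` once the dedicated TLS flag is in place.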
Summary:
Pull Request resolved: #53343
For easier review, here is a diff against the version before the revert: https://www.internalfb.com/phabricator/paste/view/P363902707
Test Plan: Imported from OSS
Differential Revision: D27443229
Pulled By: ailzhang