Conversation
jgong5 left a comment:

Can you add a PR description briefly explaining why you made the changes and what you changed?
test/dynamo/test_optimizations.py (outdated):

```python
self.assertEqual(r2.dtype, torch.float32)
for dynamic_shapes in [True, False]:
    torch._dynamo.reset()
    torch._dynamo.config.dynamic_shapes = dynamic_shapes
```
Can you use `with patch` to make sure the configuration changes here won't impact other tests?
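The suggestion above can be sketched with `unittest.mock.patch.object`, which restores the attribute when the block exits, even on error. The `config` object here is a stand-in for `torch._dynamo.config` so the snippet is self-contained:

```python
# Sketch of the reviewer's suggestion: scope a config change with
# patch.object so it cannot leak into other tests.
# "config" is a stand-in for torch._dynamo.config.
from types import SimpleNamespace
from unittest.mock import patch

config = SimpleNamespace(dynamic_shapes=False)

def run_case(dynamic_shapes):
    # The attribute is only set for the duration of the with-block.
    with patch.object(config, "dynamic_shapes", dynamic_shapes):
        return config.dynamic_shapes

print(run_case(True))          # True inside the patched scope
print(config.dynamic_shapes)   # False again afterwards
```

Unlike assigning to `torch._dynamo.config.dynamic_shapes` directly, the original value is restored even if an assertion in the test body fails.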
```python
    return BACKENDS["ipex"](gm, example_inputs, **kwargs_ipex)

@create_backend
def ipex(subgraph):
    import intel_extension_for_pytorch  # type: ignore[import]  # noqa: F401
```
Do a try/except and fall back to eager on exception?
I checked other backends like the onnx backend; they don't have a fallback path either, so I'm not sure it needs to be added.
I saw the "tvm" backend does fall back. Anyway, at the least we'd better print a readable error message on import error?
Have added a readable error message, thanks!
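The fallback-to-eager idea discussed above can be sketched generically; the wrapper name and structure here are illustrative, not the PR's actual code:

```python
# Illustrative sketch: wrap a backend's compile function so that any
# exception during compilation logs an error and returns the original
# graph module unchanged, i.e. falls back to eager execution.
import logging

log = logging.getLogger(__name__)

def with_eager_fallback(compile_fn):
    def wrapped(gm, example_inputs):
        try:
            return compile_fn(gm, example_inputs)
        except Exception:
            log.exception("backend failed; falling back to eager")
            return gm  # the unmodified callable runs eagerly
    return wrapped

def broken_backend(gm, example_inputs):
    # Simulates a backend whose import or compilation fails.
    raise ImportError("backend not installed")

identity = lambda x: x  # stands in for a GraphModule
compiled = with_eager_fallback(broken_backend)(identity, [])
print(compiled is identity)  # True: fell back to the original callable
```

This is the pattern the "tvm" backend comparison alludes to: a failed optimization path degrades to correct (if slower) eager execution instead of crashing.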
```python
try:
    import intel_extension_for_pytorch  # type: ignore[import]  # noqa: F401
except ImportError:
    log.exception("ipex backend fails. Cannot import intel_extension_for_pytorch")
```
How about: "Unable to import Intel Extension for PyTorch (IPEX). Please install the right version of IPEX that matches the PyTorch version being used. Refer to https://github.com/intel/intel-extension-for-pytorch for details."
It's better, thank you!
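Putting the exchange together, the import guard presumably ends up shaped roughly like this. The helper below is generalized over the module name so the snippet is self-contained; the real code imports `intel_extension_for_pytorch` directly:

```python
# Sketch of the agreed-upon import guard with a readable error message.
# module_available() is a hypothetical helper, not the PR's actual code.
import importlib
import logging

log = logging.getLogger(__name__)

IPEX_MSG = (
    "Unable to import Intel Extension for PyTorch (IPEX). Please install "
    "the right version of IPEX that matches the PyTorch version being used. "
    "Refer to https://github.com/intel/intel-extension-for-pytorch for details."
)

def module_available(name, message):
    # Return True if the module imports; otherwise log the readable
    # message (with traceback, via log.exception) and return False.
    try:
        importlib.import_module(name)
    except ImportError:
        log.exception(message)
        return False
    return True

print(module_available("logging", IPEX_MSG))  # True: stdlib module imports fine
```

`log.exception` keeps the traceback for debugging while the message itself tells the user what to install, which is the improvement the review asked for over the original terse error.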
@jansel, could you please help review this PR? Thanks!
This is a copy of #92067 to resolve the merge conflicts with the next PR in the stack. Go ahead and land #92067, then I can delete this one. [ghstack-poisoned]
@pytorchbot merge

Merge started. Your change will be merged once all checks pass (ETA 0-4 hours).
cc @mlazos @soumith @voznesenskym @yanboliang @penguinwu @anijain2305 @EikanWang @jgong5 @Guobing-Chen @chunyuan-w @XiaobingSuper @zhuhaozhe @blzheng @Xia-Weiwen @wenzhe-nrv @desertfire