[JIT] add support for torch.norm #33783
eellison wants to merge 10 commits into gh/eellison/58/base from
Conversation
    elif dtype is None:
        return torch._C._VariableFunctions.norm(input, p, dim, keepdim=keepdim, out=out)
    return torch._C._VariableFunctions.norm(input, p, dim, keepdim=keepdim, dtype=dtype, out=out)

    dim = [i for i in range(ndim)]  # noqa: C416 TODO: rewrite as list(range(m))
I noticed this in at least one other preceding PR as well; perhaps we could consider adding this as a separate function to reduce duplicated code.
Sounds good. I’m also going to try to add list(range()) support to TorchScript in the next week.
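The dispatch pattern in the hunk above can be sketched roughly as follows. This is a simplified, hypothetical reconstruction (not the exact PR diff): `_variable_norm` is a stand-in for `torch._C._VariableFunctions.norm`, and a plain nested list stands in for a tensor, so the sketch only demonstrates the control flow — forwarding `dtype` only when it is not `None`, and building the flagged `dim` list when no dimension is given.

```python
# Hypothetical sketch of the None-based kwarg dispatch shown in the diff.
# _variable_norm stands in for torch._C._VariableFunctions.norm and just
# records which arguments were forwarded to it.

def _variable_norm(input, p, dim, keepdim=False, dtype=None, out=None):
    return {"p": p, "dim": dim, "keepdim": keepdim, "dtype": dtype}

def norm(input, p="fro", dim=None, keepdim=False, dtype=None, out=None):
    if dim is None:
        ndim = len(input)  # stand-in for input.dim()
        # the line the review comment flagged:
        dim = [i for i in range(ndim)]  # noqa: C416 TODO: rewrite as list(range(ndim))
    if dtype is None:
        # dtype must be omitted entirely when unset, since the underlying
        # overloads accept different keyword-argument sets
        return _variable_norm(input, p, dim, keepdim=keepdim, out=out)
    return _variable_norm(input, p, dim, keepdim=keepdim, dtype=dtype, out=out)
```

Factoring the `dim = [i for i in range(ndim)]` default into a shared helper, as suggested above, would touch only the first branch.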
    def norm():
        c = torch.tensor([[1, 2, 3], [-1, 1, 4]], dtype=torch.float)
        return torch.norm(c, p="fro"), torch.norm(c, p="nuc"), torch.norm(c), torch.norm(c, p=.5)
Is it not necessary to test with different function arguments, or are they left out since they are covered by the standard tests in test_torch?
Yeah, I was mostly testing that it compiles with different types of arguments. Since test_torch tests with different argument values and the code is the same for Python/TorchScript, that should be sufficient.
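For reference, the expected values for two of the calls in that test can be checked from the mathematical definitions alone, without torch: `torch.norm(c)` and `torch.norm(c, p="fro")` both compute the Frobenius norm, and for a float `p`, `torch.norm` computes the element-wise `(Σ|x|^p)^(1/p)`. (The nuclear norm needs an SVD, so it is omitted here.)

```python
import math

# Same matrix as in the test case above.
c = [[1, 2, 3], [-1, 1, 4]]
flat = [v for row in c for v in row]

# Frobenius norm: sqrt of the sum of squares = sqrt(32) ~= 5.6569
fro = math.sqrt(sum(v * v for v in flat))

# p = 0.5 vector norm over all elements: (sum |x|^0.5)^(1/0.5) = (sum |x|^0.5)^2
p_half = sum(abs(v) ** 0.5 for v in flat) ** 2.0
```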
Fix for #20113
Fix for #20113 Differential Revision: [D20121917](https://our.internmc.facebook.com/intern/diff/D20121917)
Summary: Relanding pytorch#33783
Pull Request resolved: pytorch#36146
Differential Revision: D20895316
Pulled By: eellison
fbshipit-source-id: 9a2bc0e6bdcbd43f9abe51eadaa28f90bccafcc9
Summary: Fix for #37986. Follows the stack in #33783 to make functions in `torch/functional.py` resolve to their Python implementations. Because the return type of `torch.unique` depends on `return_inverse` and `return_counts`, I had to refactor the implementation to use our boolean_dispatch mechanism.
Pull Request resolved: #38156
Differential Revision: D21504449
Pulled By: eellison
fbshipit-source-id: 7efb1dff3b5c00655da10168403ac4817286ff59
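The boolean_dispatch mechanism mentioned in that summary can be sketched as follows. This is a simplified, hypothetical analog of the real helper in `torch/_jit_internal.py` (whose actual signature differs), shown here only to illustrate the idea: because TorchScript needs a statically known return type, the wrapper selects between two separate implementations based on a boolean argument, so each branch has a single fixed return type. The `unique`-like functions below are toy stand-ins, not torch.unique.

```python
# Simplified boolean_dispatch-style helper: choose between two
# implementations based on a boolean keyword argument, so that each
# implementation has a single, fixed return type.

def boolean_dispatch(arg_name, if_true, if_false, default):
    def dispatched(*args, **kwargs):
        flag = kwargs.get(arg_name, default)
        impl = if_true if flag else if_false
        return impl(*args, **kwargs)
    return dispatched

# Toy stand-ins for the two torch.unique variants: one returns a pair
# (values, counts), the other just the values.
def _unique_with_counts(values, return_counts=True):
    out = sorted(set(values))
    return out, [values.count(v) for v in out]

def _unique_plain(values, return_counts=False):
    return sorted(set(values))

unique = boolean_dispatch("return_counts", _unique_with_counts, _unique_plain, False)
```

With this split, a compiler that sees `return_counts=True` at a call site can resolve the call to `_unique_with_counts` and know the return type statically.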
Summary: Pull Request resolved: pytorch#33783
Fix for pytorch#20113
Test Plan: Imported from OSS
Differential Revision: D20121917
Pulled By: eellison
fbshipit-source-id: ffedcc40678cd80f5529ff9323088eed544e5158
Stack from ghstack:
Fix for #20113
Differential Revision: D20121917