Padding: support complex dtypes #50594
peterbell10 wants to merge 10 commits into gh/peterbell10/42/base from
Conversation
Fixes #50234 [ghstack-poisoned]
torch.randn(807, device=device, dtype=dtype),
torch.randn(12, 14, device=device, dtype=dtype),
torch.randn(9, 6, device=device, dtype=dtype)),
torch.randn(12, 60, device=device, dtype=dtype)),
Why change these inputs?
reflect mode needs inputs large enough to reflect into, otherwise you get an error:
RuntimeError: Argument #4: Padding size should be less than the corresponding input dimension, but got: [...]
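To illustrate why reflect mode imposes this limit, here is a minimal pure-Python sketch of 1-D reflect padding (the function name `reflect_pad_1d` is hypothetical and this is not PyTorch's implementation). Reflection mirrors interior elements around each edge, excluding the edge element itself, so each side can supply at most `len(xs) - 1` padded values; a larger pad has nothing left to reflect. Because the logic never touches the element values themselves, it works unchanged for complex inputs, which is the point of this PR.

```python
def reflect_pad_1d(xs, pad):
    """Reflect-pad a 1-D sequence by `pad` elements on each side.

    Mirrors interior elements around the edges (edge element excluded),
    matching the indexing used by 'reflect' padding mode.
    """
    n = len(xs)
    if pad >= n:
        # Same constraint the RuntimeError above reports.
        raise ValueError(
            "Padding size should be less than the corresponding "
            f"input dimension, but got pad={pad} for size {n}"
        )
    left = [xs[i] for i in range(pad, 0, -1)]        # xs[pad], ..., xs[1]
    right = [xs[n - 2 - i] for i in range(pad)]      # xs[n-2], ..., xs[n-1-pad]
    return left + list(xs) + right

print(reflect_pad_1d([1, 2, 3, 4], 2))   # → [3, 2, 1, 2, 3, 4, 3, 2]
print(reflect_pad_1d([1 + 2j, 3 + 0j, 0 - 1j], 1))  # complex values pass through
```

A test input of size 4 therefore supports a reflect pad of at most 3 per side, which is why the inputs above were enlarged.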
mruberry
left a comment
Hey @peterbell10! Overall this looks great and very thorough. I've asked @anjali411 to take a look, too. I have two questions inline that I'm curious to hear your thinking on.
@mruberry PTAL, added the padding mode missing from the tests. Also fixed merge conflicts with #50632. As an aside, is there anything we can do to reduce merge conflicts in the GRADIENT_IMPLEMENTED_FOR_COMPLEX list? I've had multiple conflicts for every PR that adds complex gradient support. Perhaps one entry per line and alpha-sorting would help.
I'll take a look ASAP.
Good point. What do you think, @anjali411?
LGTM!
> As an aside, is there anything we can do to reduce merge conflicts in the GRADIENT_IMPLEMENTED_FOR_COMPLEX list? I've had multiple conflicts for every PR that adds complex gradient support. Perhaps one entry per line and alpha-sorting would help.
@peterbell10 yeah, I agree that merge conflicts are pretty annoying, however I think there'd still be merge conflicts with those solutions, no?
@anjali411 two PRs appending to the same list are guaranteed to create merge conflicts. Inserting into the middle of an alpha-sorted list only creates conflicts if the operators are next to each other in the alphabet. So yes, there can still be merge conflicts, but at least some of the time it can be merged automatically.
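The alpha-sorted, one-entry-per-line idea can be sketched as follows (a toy illustration, not PyTorch's actual registry; the operator names and `register` helper are hypothetical). Each PR's insertion lands at its sorted position and touches only neighboring lines, so two PRs adding alphabetically distant operators produce non-overlapping diffs that git can merge automatically:

```python
import bisect

# Hypothetical stand-in for a sorted, one-entry-per-line
# GRADIENT_IMPLEMENTED_FOR_COMPLEX list.
GRADIENT_IMPLEMENTED_FOR_COMPLEX = [
    "abs",
    "add",
    "mul",
    "view_as_real",
]

def register(op_name):
    # Insert at the sorted position; only the lines adjacent to the
    # insertion point appear in the diff.
    bisect.insort(GRADIENT_IMPLEMENTED_FOR_COMPLEX, op_name)

register("constant_pad_nd")
print(GRADIENT_IMPLEMENTED_FOR_COMPLEX)
# → ['abs', 'add', 'constant_pad_nd', 'mul', 'view_as_real']
```

Appending to the end of an unsorted list, by contrast, makes every such PR edit the same final line, guaranteeing a conflict.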
Makes sense, okay!
Fixes #50234 Differential Revision: [D25987316](https://our.internmc.facebook.com/intern/diff/D25987316) [ghstack-poisoned]
[Internal test failure on phabricator] One of the caffe2 tests fails with the following error:
ghstack-source-id: e64d52c Pull Request resolved: pytorch#50594
@anjali411 merged this pull request in db079a9.
Summary: Pull Request resolved: pytorch#50594
Fixes pytorch#50234
Test Plan: Imported from OSS
Reviewed By: pbelevich
Differential Revision: D25987316
Pulled By: anjali411
fbshipit-source-id: c298b771fe52b267a86938e886ea402badecfe3e
Stack from ghstack:
Fixes #50234
Differential Revision: D25987316