fix as_strided_scatter_backward (#87646)
🔗 Helpful Links: 🧪 See artifacts and rendered test results at hud.pytorch.org/pr/87646
Note: Links to docs will display an error until the docs builds have been completed.
❌ 2 Failures: as of commit 874f298, the following jobs have failed:
This comment was automatically generated by Dr. CI and updates every 15 minutes.
albanD left a comment:
Great catch!
Also I guess this means we don't have OpInfos (and thus autograd testing) for these ops... We should add them!
as_strided_scatter's derivative formula was broken - instead of making a "mask" of 1's and 0's, it would effectively make a mask of 1's and uninitialized memory.
We had a few skipped OpInfos for
If this goes in after https://github.com/pytorch/pytorch/pull/85583/files, then we should revisit the skips on the OpInfo.
@pytorchbot revert -m 'Sorry for reverting your PR but I think this one or one of the PR in the stack break bionic-cuda11.7 on trunk https://hud.pytorch.org/pytorch/pytorch/commit/70782981f06a042796d4604df2ec1491f4f5b194' -c nosignal
@pytorchbot successfully started a revert job. Check the current status here.
@bdhirsh your PR has been successfully reverted.
This reverts commit f9d7985.
Reverted #87646 on behalf of https://github.com/huydhn due to Sorry for reverting your PR but I think this one or one of the PR in the stack break bionic-cuda11.7 on trunk https://hud.pytorch.org/pytorch/pytorch/commit/70782981f06a042796d4604df2ec1491f4f5b194
This reverts commit 71fb763.
as_strided_scatter's derivative formula was broken - instead of making a "mask" of 1's and 0's, it would effectively make a mask of 1's and uninitialized memory.
Fixes pytorch#88105
Pull Request resolved: pytorch#87646
Approved by: https://github.com/albanD

This reverts commit f9d7985. Reverted pytorch#87646 on behalf of https://github.com/huydhn due to Sorry for reverting your PR but I think this one or one of the PR in the stack break bionic-cuda11.7 on trunk https://hud.pytorch.org/pytorch/pytorch/commit/70782981f06a042796d4604df2ec1491f4f5b194

This reverts commit 71fb763.
Pull Request resolved: #88342
Approved by: https://github.com/zou3519
Looks like this PR hasn't been updated in a while, so we're going to go ahead and mark this as
as_strided_scatter's derivative formula was broken - instead of making a "mask" of 1's and 0's, it would effectively make a mask of 1's and uninitialized memory.
Fixes #88105
Stack from ghstack (oldest at bottom):
cc @albanD
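A minimal illustration of the mask described above (the size/stride/values here are made up for the example, not taken from the PR): the backward formula needs a mask that is 1 where `src` was scattered and 0 everywhere else, so the base tensor must be zero-initialized (e.g. `torch.zeros`) rather than uninitialized storage (e.g. `torch.empty`), which is effectively what the broken formula used.

```python
import torch

# Correct construction: scatter 1's into a tensor of guaranteed 0's.
# Starting from torch.empty instead would leave the complement of the
# scattered window as uninitialized memory -- the bug this PR fixes.
base = torch.zeros(3, 3)     # zero-initialized, so the "0" region is really 0
src_ones = torch.ones(2, 2)

# Embed the 1's into the strided window of a copy of `base`.
mask = torch.as_strided_scatter(base, src_ones, size=(2, 2), stride=(3, 1))
# mask is now [[1, 1, 0], [1, 1, 0], [0, 0, 0]]
```

With size `(2, 2)` and stride `(3, 1)` on a contiguous 3x3 base, the window covers flat indices 0, 1, 3, and 4, so exactly four entries of the mask are 1.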