Conversation
The semantics is "as if" you did a reshape, but it always copied even if the input was directly view'able. Signed-off-by: Edward Z. Yang <ezyang@fb.com> [ghstack-poisoned]
🔗 Helpful links: 🧪 see artifacts and rendered test results at hud.pytorch.org/pr/88314.
Note: links to docs will display an error until the docs builds have completed.
❌ 1 failure as of commit 4c0a6a4.
This comment was automatically generated by Dr. CI and updates every 15 minutes.
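The intended semantics of the new operator ("as if" you did a reshape, but always a copy) can be sketched in plain Python, with lists standing in for tensors. The helper below is illustrative only, not PyTorch's implementation:

```python
from math import prod

def reshape_copy(flat, size):
    # "As if" you did a reshape, but the result never aliases the input:
    # the data is copied even when a view would have sufficed.
    if prod(size) != len(flat):
        raise ValueError(f"cannot reshape {len(flat)} elements into {size}")
    data = list(flat)  # unconditional copy
    rows, cols = size
    return [data[r * cols:(r + 1) * cols] for r in range(rows)]

x = [0, 1, 2, 3, 4, 5]
y = reshape_copy(x, (2, 3))
y[0][0] = 99   # mutating the copy leaves the input untouched
```

The key contrast with plain `reshape` is the last two lines: writing through the result can never be observed through the input.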
tools/autograd/derivatives.yaml (outdated)

- name: special_spherical_bessel_j0(Tensor x) -> Tensor
  x: non_differentiable

- name: reshape_copy(Tensor self, SymInt[] size) -> Tensor
Would it work to make it a copy: bool kwarg, like we did for .to()? https://pytorch.org/docs/master/generated/torch.Tensor.to.html#torch.Tensor.to
Even if I did add the kwarg, I would still have to add an extra operator, similar to how we have _to_copy
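A rough sketch of the dispatch shape being described, on plain Python lists (names other than `_to_copy` are illustrative): even with a public `copy=` kwarg, dispatch still bottoms out in a dedicated always-copy operator, mirroring how `.to()` lowers to `_to_copy`. The shape logic is elided here; only the copy/alias split is shown.

```python
def _reshape_copy(flat):
    # hypothetical internal operator: unconditionally copies
    return list(flat)

def reshape(flat, *, copy=False):
    # public entry point; with copy=False the result may alias the input
    if copy:
        return _reshape_copy(flat)
    return flat
```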
albanD left a comment:
Just a question about consistency with .to()
But if you actually need a different function and the flag is not enough, the PR sounds good.
TORCH_CHECK(0, "_reshape_copy not implemented for mkldnn tensors");
}

if (self.is_contiguous()) {
Do you need this conditional? You are making a clone anyway.
I guess I can always clone and then unsafe view... I suppose I was trying to avoid calling unsafe view in some situations haha
The guts here don't really matter, the inside of this function is never traced.
Unsafe view is actually safe: it does check the sizes. It just doesn't set up the autograd tracking, which you don't need here.
ok, I will update this
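The simplification agreed on here, sketched on plain Python lists (illustrative names, not the real PyTorch internals): drop the contiguity special case, clone unconditionally, and let the "unsafe" view step validate the sizes. Unsafe means only that no autograd bookkeeping is done.

```python
from math import prod

def _unsafe_view(data, size):
    # still checks sizes; "unsafe" just means no autograd tracking
    if prod(size) != len(data):
        raise ValueError(f"size {size} does not match {len(data)} elements")
    rows, cols = size
    return [data[r * cols:(r + 1) * cols] for r in range(rows)]

def reshape_copy(data, size):
    # no is_contiguous() branch: always clone, then view the clone
    return _unsafe_view(list(data), size)
```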
@pytorchbot merge -f "failures are spurious"

Merge started. Your change will be merged immediately since you used the force (-f) flag, bypassing any CI checks (ETA: 1-5 minutes).
The semantics is "as if" you did a reshape, but it always copied even if the input was directly view'able. Signed-off-by: Edward Z. Yang <ezyang@fb.com> Pull Request resolved: pytorch#88314 Approved by: https://github.com/albanD