Closed
Labels: high priority, module: autograd, module: reductions, quansight-nack, triaged
Description
🐛 Bug
```python
import torch
torch.__version__
x = torch.ones(3, requires_grad=True)
out = x.mean([])  # gives 1, as expected
out.backward()
x.grad  # gives tensor([1., 1., 1.]), which is incorrect!
```
`torch.mean(x, [])` should be equivalent to `torch.mean(x)`, unless I am missing something. Compare the above code to the following:
```python
y = torch.ones(3, requires_grad=True)
out = y.mean()
out.backward()
y.grad  # gives tensor([0.3333, 0.3333, 0.3333]), as expected
```
This is on PyTorch 1.2. One can't actually run this code on PyTorch 1.3 (it says the gradient is not implemented).
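For reference, the expected gradient can be confirmed without autograd: for `out = mean(x)` over n elements, each partial derivative is 1/n. The sketch below (plain Python; `mean_all` and `numeric_grad` are hypothetical helper names, not PyTorch APIs) checks this by finite differences and matches the second snippet's `y.grad`:

```python
def mean_all(xs):
    # Mean over every element, i.e. what torch.mean(x) computes.
    return sum(xs) / len(xs)

def numeric_grad(f, xs, eps=1e-6):
    # One-sided finite-difference estimate of df/dx_i for each i.
    base = f(xs)
    grads = []
    for i in range(len(xs)):
        bumped = list(xs)
        bumped[i] += eps
        grads.append((f(bumped) - base) / eps)
    return grads

print(numeric_grad(mean_all, [1.0, 1.0, 1.0]))
# each entry should be close to 1/3, not 1
```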
cc @ezyang @gchanan @zou3519 @jerryzh168 @ssnl @albanD @gqchen