for python 3 and any version of pytorch, when using `.squeeze()`, autograd appends a dimension on the backward pass even when that dimension has already been squeezed out, so the gradient shapes no longer match. simple repro below:
```python
import torch
import torch.nn as nn
from torch.autograd import Variable

class model(nn.Module):
    def __init__(self):
        super(model, self).__init__()
        self.linear = nn.Linear(1, 1)

    def forward(self, x):
        return self.linear(x).squeeze()

m = model()
d = Variable(torch.randn(1))
out = m(d)
out.backward()
# RuntimeError: matrices expected, got [1 x 1 x 1], [1 x 1]
# at /Users/jpchen/pytorch/aten/src/TH/generic/THTensorMath.c:1428
```
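as a workaround, squeezing only the trailing dimension avoids collapsing batch dimensions, and calling backward on a scalar sidesteps the implicit-gradient shape issue. a minimal sketch, assuming a modern pytorch (>= 0.4, where `Variable` is merged into `Tensor`) — the class and variable names here are illustrative, not from the original repro:

```python
import torch
import torch.nn as nn

class SqueezeModel(nn.Module):
    # illustrative name, not from the original report
    def __init__(self):
        super().__init__()
        self.linear = nn.Linear(1, 1)

    def forward(self, x):
        # squeeze only the last dim so batch dims survive
        return self.linear(x).squeeze(-1)

m = SqueezeModel()
d = torch.randn(3, 1)     # batch of 3 samples, 1 feature each
out = m(d)                # shape: (3,)
out.sum().backward()      # reduce to a scalar before backward
```

with `squeeze(-1)` the output keeps its batch dimension, so the gradient flowing back has a shape the linear layer expects.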