
std::bad_cast when differentiating masked_fill #1677

@philipjackson

Description

import torch
from torch.autograd import Variable


X = Variable(torch.rand(256), requires_grad=True)
Y = X.masked_fill(X > 0.5, 0).sum()

Y.backward()

results in:

---------------------------------------------------------------------------
RuntimeError                              Traceback (most recent call last)
/home/phil/code/mwe/mwe.py in <module>()
      6 Y = X.masked_fill(X > 0.5, 0).sum()
      7 
----> 8 Y.backward()

/home/phil/anaconda2/envs/pytorch/lib/python2.7/site-packages/torch/autograd/variable.pyc in backward(self, gradient, retain_graph, create_graph, retain_variables)
    149             Defaults to False, unless ``gradient`` is a volatile Variable.
    150         """
--> 151         torch.autograd.backward(self, gradient, retain_graph, create_graph, retain_variables)
    152 
    153     def register_hook(self, hook):

/home/phil/anaconda2/envs/pytorch/lib/python2.7/site-packages/torch/autograd/__init__.pyc in backward(variables, grad_variables, retain_graph, create_graph, retain_variables)
     96 
     97     Variable._execution_engine.run_backward(
---> 98         variables, grad_variables, retain_graph)
     99 
    100 

RuntimeError: std::bad_cast

Using commit 4eb448.
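For reference, the backward pass being requested here is straightforward: `Y = X.masked_fill(mask, 0).sum()` should give `dY/dX[i] = 0` wherever the mask is `True` and `1` elsewhere, so the crash is in the engine rather than in any ambiguity of the gradient. A minimal pure-Python sketch of the expected gradient (the helper name `masked_fill_grad` is hypothetical, for illustration only):

```python
def masked_fill_grad(mask):
    """Gradient of sum(masked_fill(x, mask, value)) w.r.t. x.

    Masked positions are overwritten by a constant, so their gradient
    is 0; unmasked positions pass through to the sum, so theirs is 1.
    """
    return [0.0 if m else 1.0 for m in mask]


x = [0.2, 0.7, 0.4, 0.9]
mask = [xi > 0.5 for xi in x]   # mimics X > 0.5 in the repro
grad = masked_fill_grad(mask)   # expected X.grad
# grad == [1.0, 0.0, 1.0, 0.0]
```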
