
RuntimeError when using a += b but not when doing a= a + b #2425

@MarkTension

Description


With PyTorch I'm having a problem doing an operation with two Variables:

    sub_patch  : [torch.FloatTensor of size 9x9x32]

    pred_patch : [torch.FloatTensor of size 5x5x32]

sub_patch is a Variable created with torch.zeros.
pred_patch is a Variable whose 25 entries I index with a nested for-loop; each entry is multiplied by its corresponding unique filter (sub_filt_patch) of size [5,5,32], and the result is added to its respective place in sub_patch.

This is a piece of my code:

for i in range(filter_sz):
    for j in range(filter_sz):

        # index correct filter from filter tensor
        sub_filt_col = (patch_col + j) * filter_sz
        sub_filt_row = (patch_row + i) * filter_sz

        sub_filt_patch = sub_filt[sub_filt_row:(sub_filt_row + filter_sz), sub_filt_col:(sub_filt_col + filter_sz), :]

        # multiply filter and pred_patch and sum onto sub patch
        sub_patch[i:(i + filter_sz), j:(j + filter_sz), :] += (sub_filt_patch * pred_patch[i, j]).sum(dim=3)

The error raised by the last line of this snippet is:

RuntimeError: in-place operations can be only used on variables that don't share storage with any other variables, but detected that there are 2 objects sharing it

This error is not raised when using:

sub_patch[i:(i + filter_sz), j:(j + filter_sz), :] = sub_patch[i:(i + filter_sz), j:(j + filter_sz), :] + (sub_filt_patch * pred_patch[i,j]).sum(dim=3)
instead of

sub_patch[i:(i + filter_sz), j:(j + filter_sz), :] += (sub_filt_patch * pred_patch[i,j]).sum(dim=3)
Could this be a bug?
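For context, the two forms are not equivalent: `a += b` mutates the existing storage in place, while `a = a + b` allocates a new tensor and rebinds the name, so only the former writes through storage that a slice shares with its parent. A minimal sketch of the distinction using plain NumPy arrays (an assumption for illustration; the original code uses torch Variables):

```python
import numpy as np

base = np.zeros((4, 4))
view = base[1:3, 1:3]      # a slice shares storage with `base`

view += 1.0                # in-place: writes through to `base`'s buffer
assert base[1, 1] == 1.0   # the parent array sees the update

view = view + 1.0          # out-of-place: a fresh array, name rebound
assert base[1, 1] == 1.0   # the parent array is unchanged by this line
```

This shared-storage mutation is what the in-place check in the error message is guarding against; the out-of-place form never touches the shared buffer directly.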
