
Dropout2d doesn't drop channels for (C, H, W) #69801

@OverLordGoldDragon


🐛 Describe the bug

The docs state that entire channels (C) are zeroed out, which does not happen for (C, H, W) input:

```python
import torch
torch.manual_seed(0)
ipt = torch.ones((2, 3, 4))
print(torch.nn.Dropout2d(p=0.5)(ipt))
```
```
tensor([[[2., 2., 2., 2.],
         [0., 0., 0., 0.],
         [0., 0., 0., 0.]],
        [[2., 2., 2., 2.],
         [2., 2., 2., 2.],
         [0., 0., 0., 0.]]])
```

but it does for (N, C, H, W):

```python
torch.manual_seed(0)
print(torch.nn.Dropout2d(p=0.5)(ipt[None]))
```
```
tensor([[[[2., 2., 2., 2.],
          [2., 2., 2., 2.],
          [2., 2., 2., 2.]],
         [[0., 0., 0., 0.],
          [0., 0., 0., 0.],
          [0., 0., 0., 0.]]]])
```

It appears Dropout2d is instead implementing Dropout1d for (N, C, T).
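A possible workaround sketch until the behavior is fixed (my own suggestion, not part of this report): explicitly add a batch dimension before applying `Dropout2d`, then remove it again, so the input is seen as (N, C, H, W) and whole channels are dropped as documented.

```python
import torch

torch.manual_seed(0)
ipt = torch.ones((2, 3, 4))  # unbatched (C, H, W)

# Workaround: lift to (1, C, H, W) so Dropout2d zeroes entire channels,
# then strip the batch dim again.
drop = torch.nn.Dropout2d(p=0.5)
out = drop(ipt.unsqueeze(0)).squeeze(0)

# Each channel is now either entirely zeroed or entirely scaled by 1/(1-p).
print(out)
```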

Versions

1.10.0 via Anaconda; Windows 10

cc @ezyang @gchanan @zou3519 @bdhirsh @albanD @mruberry @jbschlosser @walterddr @kshitij12345


Labels

- high priority
- module: correctness (silent): issue that returns an incorrect result silently
- module: nn: related to torch.nn
- triaged: this issue has been looked at by a team member, and triaged and prioritized into an appropriate module
