🚀 Feature
All NN modules in the `torch.nn` namespace must accept zero-batch-size tensors in both the forward and backward passes.
Follow-up to #12013. Since the list of modules is long, the recommended approach is to group at least a few layers into a single PR.
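A minimal sketch of the kind of check each module should pass (the helper name `check_zero_batch` is hypothetical, not part of the PyTorch test suite):

```python
import torch
import torch.nn as nn

def check_zero_batch(module, input_shape):
    """Run forward and backward with batch size 0; return the output shape."""
    x = torch.randn(0, *input_shape, requires_grad=True)
    out = module(x)           # forward must accept an empty batch
    out.sum().backward()      # backward must also succeed
    assert x.grad is not None and x.grad.shape == x.shape
    return out.shape

# Example: nn.Linear should map a (0, 4) input to a (0, 8) output.
shape = check_zero_batch(nn.Linear(4, 8), (4,))
```

Modules that fail this pattern typically raise a shape or view error on the empty tensor rather than propagating the zero batch dimension through.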
cc @albanD @mruberry @jbschlosser