
torch.nn modules should accept 0-batch dim tensors. #38115

@v0dro


🚀 Feature

All NN modules in the torch.nn namespace should accept tensors with a zero-sized batch dimension in both the forward and backward passes.
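A minimal sketch of the requested behavior, using nn.Linear (one module that already handles empty batches) as a reference point:

```python
import torch
import torch.nn as nn

# nn.Linear already accepts a 0-sized batch; the request is for
# every torch.nn module to behave the same way.
layer = nn.Linear(4, 8)
x = torch.randn(0, 4, requires_grad=True)  # batch dimension of size 0

out = layer(x)        # forward: expected shape (0, 8)
out.sum().backward()  # backward should also succeed on an empty batch

print(out.shape)     # torch.Size([0, 8])
print(x.grad.shape)  # torch.Size([0, 4])
```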

Follow-up from #12013, since the list of affected modules is long. The recommended way to tackle this is to group at least a few layers into a single PR.
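As an illustration, a grouped PR could verify several layers with one small loop, as in the sketch below. The module/input pairs here are hypothetical examples, not the actual list tracked by this issue, and on versions predating the fixes some entries may still raise:

```python
import torch
import torch.nn as nn

# Illustrative check that a handful of layers accept a 0-sized batch
# in both forward and backward.
cases = [
    (nn.Linear(4, 8), torch.randn(0, 4)),
    (nn.Conv2d(3, 6, kernel_size=3), torch.randn(0, 3, 8, 8)),
    (nn.ReLU(), torch.randn(0, 4)),
]

for module, x in cases:
    x.requires_grad_()
    out = module(x)       # forward on an empty batch
    out.sum().backward()  # backward on an empty batch
    print(type(module).__name__, "->", tuple(out.shape))
```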

cc @albanD @mruberry @jbschlosser

Labels

module: nn — Related to torch.nn
tracker — A tracking issue
triaged — This issue has been looked at by a team member, and triaged and prioritized into an appropriate module
