GaussianNLLLoss doesn't support the usual reduction='none' #53964

Description

@almson

The new Gaussian NLL Loss behaves differently from the other losses: it doesn't support the usual 'none' mode. Instead, it still reduces over all dimensions except the batch dimension.

What I expect: gaussian_nll_loss(..., reduction='none') should return a tensor with the same shape as input, target, and var. The loss is essentially just 0.5 * (torch.log(var) + (input - target)**2 / var), which is what I want returned.

What happens: a scalar or a tensor of shape (N,) is returned; the implementation does .view(input.size(0), -1).sum(dim=1).
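The mismatch can be sketched with a few lines of PyTorch. The tensor shapes below are made up for illustration; the "reduced" tensor reproduces by hand the sum the implementation performs, rather than calling the loss itself:

```python
import torch

# Hypothetical shapes (N=4 batch elements, 3 features), for illustration only.
input = torch.randn(4, 3)
target = torch.randn(4, 3)
var = torch.rand(4, 3) + 0.1  # keep variance strictly positive

# The elementwise loss I expect reduction='none' to return
# (ignoring the eps clamp on var and the optional constant term):
expected = 0.5 * (torch.log(var) + (input - target) ** 2 / var)
assert expected.shape == input.shape  # same shape as input, target, var

# What the implementation does instead: sum over all non-batch
# dimensions, producing a tensor of shape (N,).
reduced = expected.view(expected.size(0), -1).sum(dim=1)
assert reduced.shape == (4,)
```

With reduction='none' I would expect `expected` back, so that a mask or per-element weight can be applied before reducing manually.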

Why this matters: I use reduction='none' for custom masking, weighting, etc.

P.S. gaussian_nll_loss is missing from the nn.functional documentation.

cc @albanD @mruberry @jbschlosser

    Labels

    module: loss (Problem is related to loss function)
    module: nn (Related to torch.nn)
    triaged (This issue has been looked at by a team member, and triaged and prioritized into an appropriate module)
