
Allow target.requires_grad in l1_loss and mse_loss #3876

Merged
ezyang merged 3 commits into pytorch:master from szagoruyko:loss-target
Nov 27, 2017

Conversation

@szagoruyko
Contributor

This bug cost me a lot of time in several projects: 0.2 would silently accept target.requires_grad and backpropagate zeros in l1_loss and mse_loss. It is already fixed in master (an exception is raised instead), and this PR allows target.requires_grad by explicitly defining the loss functions.
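To make the failure mode concrete, here is a minimal plain-Python sketch (no torch, all names illustrative) of the gradient that mse_loss should send to a target that requires grad. Analytically it is the negation of the input gradient and generally nonzero, whereas 0.2 silently produced zeros:

```python
# Minimal sketch, not the actual PyTorch implementation: the analytic
# gradient that mse_loss should propagate to the target.

def mse_loss(input, target):
    # mean of squared element-wise differences
    n = len(input)
    return sum((x - t) ** 2 for x, t in zip(input, target)) / n

def mse_grad_wrt_target(input, target):
    # d/dt of mean((x - t)^2) is -2 * (x - t) / n: the exact negation
    # of the gradient w.r.t. input, and in general nonzero
    n = len(input)
    return [-2.0 * (x - t) / n for x, t in zip(input, target)]

inp = [1.0, 2.0, 3.0]
tgt = [0.0, 0.0, 0.0]
grads = mse_grad_wrt_target(inp, tgt)
# A correct backward pass must return these nonzero values; the 0.2
# behavior described above was to return zeros instead.
```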

Contributor

@apaszke apaszke left a comment


Looks good, but please fix the typo

Comment thread: torch/nn/functional.py (Outdated)
return loss.sum()


def _poinwise_loss(lambd, lambd_optimized, input, target, size_average=True, reduce=True):

This comment was marked as off-topic.

@apaszke
Contributor

apaszke commented Nov 26, 2017

linter still has a few complaints

@apaszke
Contributor

apaszke commented Nov 27, 2017

@pytorchbot add to whitelist

@ezyang ezyang merged commit 11c9bd6 into pytorch:master Nov 27, 2017
@soumith soumith added the 0.3.1 label Feb 4, 2018
laurentdupin pushed a commit to laurentdupin/pytorch that referenced this pull request Apr 24, 2026

4 participants