Fix missing final activation in NLLLoss second example#12703

Closed
grjhuard wants to merge 1 commit intopytorch:masterfrom
grjhuard:master
Conversation

@grjhuard
Contributor

Fixed the second example in the NLLLoss documentation.
The LogSoftmax activation was missing after the convolution layer. Without it, the example's input was not log-probabilities, so the computed loss could come out negative.
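The corrected example can be sketched as follows (a minimal reconstruction in the spirit of the 2D NLLLoss docs example; the exact tensor shapes are illustrative, not copied from the diff). The key point is that `NLLLoss` expects log-probabilities, so `LogSoftmax` over the channel dimension must follow the convolution:

```python
import torch
import torch.nn as nn

# 2D loss example, e.g. for image-like inputs.
N, C = 5, 4
loss = nn.NLLLoss()
# Input of size N x 16 x 10 x 10 fed through a conv producing C channels.
conv = nn.Conv2d(16, C, (3, 3))
# The fix: apply LogSoftmax over the class/channel dimension so the
# loss receives log-probabilities. Without this, the raw conv output
# can make the loss negative.
m = nn.LogSoftmax(dim=1)
data = torch.randn(N, 16, 10, 10)
# Each target element must satisfy 0 <= value < C.
target = torch.empty(N, 8, 8, dtype=torch.long).random_(0, C)
output = loss(m(conv(data)), target)
output.backward()
```

Since log-softmax outputs are always <= 0, the negative log-likelihood is guaranteed non-negative, which is exactly the property the original (broken) example violated.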


@ezyang ezyang left a comment


okey-dokey


@facebook-github-bot facebook-github-bot left a comment


ezyang is landing this pull request. If you are a Facebook employee, you can view this diff on Phabricator.

laurentdupin pushed a commit to laurentdupin/pytorch that referenced this pull request Apr 24, 2026
Summary:
Fixed the second example in NLLLoss.
The LogSoftmax activation was missing after the convolution layer. Without this activation, the second example loss was sometimes negative.
Pull Request resolved: pytorch#12703

Differential Revision: D10419694

Pulled By: ezyang

fbshipit-source-id: 98bfefd1050290dd5b29d3ce18fe075103db4674