[jit] dropout symbolic_script should respect the training flag #20760
Closed
suo wants to merge 6 commits into gh/suo/41/base from gh/suo/41/head
Conversation
…flag" [jit] dropout symbolic_script should respect the training flag as title gh-metadata: pytorch pytorch 20760 gh/suo/41/head
…flag" [jit] dropout symbolic_script should respect the training flag as title gh-metadata: pytorch pytorch 20760 gh/suo/41/head
Contributor
Is it possible to add a test to prevent regression in the future?
zdevito reviewed May 21, 2019
torch/csrc/jit/symbolic_script.cpp
Outdated

    res = mask * input / p1m
    p1m = 1.
    res = input
    mask = torch.zeros(0)
Contributor
The definition of backwards when is_training = False is not correct.
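For context, the inverted-dropout scheme the diff describes can be sketched in plain PyTorch. This is an illustrative reconstruction, not the actual symbolic_script code: in training mode, a mask scales the input by the keep probability `p1m`, so the backward must apply the same `mask / p1m` factor; in eval mode the forward is the identity (and the stored mask is empty), so the gradient must pass through unchanged.

```python
import torch

def dropout_forward(input, p, train):
    # Inverted dropout: scale at training time so eval is a no-op.
    if train:
        p1m = 1. - p                                        # keep probability
        mask = (torch.rand_like(input) < p1m).to(input.dtype)
        res = mask * input / p1m
    else:
        # Eval mode: dropout is the identity; no real mask is needed,
        # mirroring the `mask = torch.zeros(0)` placeholder in the diff.
        p1m = 1.
        res = input
        mask = torch.zeros(0)
    return res, mask, p1m

def dropout_backward(grad_output, mask, p1m, train):
    # In eval mode the forward was the identity, so the gradient passes
    # through unchanged; applying the (empty) mask here would be wrong,
    # which is exactly the bug the reviewer points out above.
    if train:
        return grad_output * mask / p1m
    return grad_output
```

The training check in the backward is the crux of the fix: without it, the eval-mode path would try to reuse a mask that was never materialized.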
Member
Author
oops this got merged out somehow
…flag" [jit] dropout symbolic_script should respect the training flag as title gh-metadata: pytorch pytorch 20760 gh/suo/41/head
zdevito approved these changes May 21, 2019
…flag" [jit] dropout symbolic_script should respect the training flag as title. This unfortunately means that the forward for dropout doesn't fuse completely anymore, but the "important" parts are fused, and all we're adding is the is_training check overhead. The only time we're doing "extra" stuff is if we 1) require_grad and 2) are not training, which seems like uncommon things. gh-metadata: pytorch pytorch 20760 gh/suo/41/head
…flag" [jit] dropout symbolic_script should respect the training flag as title. This unfortunately means that the forward for dropout doesn't fuse completely anymore, but the "important" parts are fused, and all we're adding is the is_training check overhead. The only time we're doing "extra" stuff is if we 1) require_grad and 2) are not training, which seems like uncommon things. gh-metadata: pytorch pytorch 20760 gh/suo/41/head
Stack from ghstack:
as title
Differential Revision: D15486511