[distributions] Refactor _log_sum_exp #9173
Closed
alicanb wants to merge 4 commits into pytorch:master from
Conversation
added 2 commits on July 4, 2018 at 15:37
vishwakftw
reviewed
Jul 4, 2018
      self._validate_sample(value)
      logits, value = broadcast_all(self.logits, value)
-     return -binary_cross_entropy_with_logits(logits, value, reduce=False)
+     return -binary_cross_entropy_with_logits(logits, value, reduction='none')
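The change swaps the deprecated `reduce=False` keyword for `reduction='none'`, keeping the elementwise (unreduced) loss so its negation can serve as `log_prob`. As a rough sketch of why that works, here is a scalar, pure-Python version of the numerically stable BCE-with-logits formula; the helper names are illustrative, not PyTorch's:

```python
import math

def bce_with_logits(logit, target):
    # Stable elementwise binary cross-entropy on a logit x with target y:
    #   max(x, 0) - x*y + log(1 + exp(-|x|))
    return max(logit, 0) - logit * target + math.log1p(math.exp(-abs(logit)))

def bernoulli_log_prob(logit, value):
    # Bernoulli log-probability is exactly the negated unreduced loss.
    return -bce_with_logits(logit, value)

p = 1.0 / (1.0 + math.exp(-0.3))  # sigmoid(0.3)
assert abs(bernoulli_log_prob(0.3, 1.0) - math.log(p)) < 1e-12
assert abs(bernoulli_log_prob(0.3, 0.0) - math.log(1.0 - p)) < 1e-12
```

With `reduction='none'` the loss keeps one entry per element, which is what a per-sample `log_prob` needs; any mean or sum reduction would collapse the batch dimension.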
vishwakftw
reviewed
Jul 4, 2018
      def entropy(self):
-         return binary_cross_entropy_with_logits(self.logits, self.probs, reduce=False)
+         return binary_cross_entropy_with_logits(self.logits, self.probs, reduction='none')
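The `entropy` case relies on the identity that evaluating BCE-with-logits at `(logit(p), p)` equals the Bernoulli entropy `-p*log(p) - (1-p)*log(1-p)`. A scalar pure-Python check (illustrative names, not PyTorch code):

```python
import math

def bce_with_logits(x, y):
    # Stable elementwise binary cross-entropy on a logit x with target y.
    return max(x, 0) - x * y + math.log1p(math.exp(-abs(x)))

p = 0.7
logit = math.log(p / (1.0 - p))  # inverse sigmoid

entropy_direct = -p * math.log(p) - (1.0 - p) * math.log(1.0 - p)
entropy_via_bce = bce_with_logits(logit, p)

assert abs(entropy_direct - entropy_via_bce) < 1e-12
```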
vishwakftw
reviewed
Jul 4, 2018
      def entropy(self):
-         return binary_cross_entropy_with_logits(self.logits, self.probs, reduce=False) / self.probs
+         return binary_cross_entropy_with_logits(self.logits, self.probs, reduction='none') / self.probs
apaszke
approved these changes
Jul 6, 2018
Collaborator
@pytorchbot test this please
This reverts commit ceae2be.
Contributor
facebook-github-bot
left a comment
@ezyang has imported this pull request. If you are a Facebook employee, you can view this diff on Phabricator.
petrex
pushed a commit
to petrex/pytorch
that referenced
this pull request
Jul 16, 2018
* upstream/master: (24 commits)
  Implement tensor weak references (pytorch#9363)
  Nuke TestCollectEnv (pytorch#9459)
  Add test case for segmentation fault fix in grad_fn (pytorch#9457)
  Add peephole optimization for type_as operators. (pytorch#9316)
  Fix out-of-range error for test_neg (pytorch#9431)
  add depthwise conv support for mkldnn (pytorch#8782)
  Refactor `_log_sum_exp` (pytorch#9173)
  Add ModuleDict and ParameterDict containers (pytorch#8463)
  Introduce SupervisedPtr, delete THAllocator and THCDeviceAllocator (pytorch#9358)
  Introducing IsInf (pytorch#9169)
  add device to CUDAEvent (pytorch#9415)
  Make localScalar error message more intuitive (pytorch#9443)
  Only accept continguous tensors in TopK for cuda (pytorch#9441)
  Add support for .norm() pytorch onnx export and ReduceL1/ReduceL2 caffe2 operators (pytorch#9299)
  Only view() rhs of index_put if we need to (pytorch#9424)
  Add BatchBucketizeOp in caffe2 (pytorch#9385)
  Implementation of Wngrad optimizer caffe2 python wrapper and unit test on least square regression (pytorch#9001)
  Implementation and operator test for Wngrad optimizer (pytorch#8999)
  Fix segmentation fault in grad_fn (pytorch#9292)
  update docs (pytorch#9423)
  ...
goldsborough
pushed a commit
to goldsborough/pytorch
that referenced
this pull request
Jul 20, 2018
Summary: This PR removes `distributions.utils._log_sum_exp` in favor of `torch.logsumexp`. Also fixes some warnings with the `reduce` arg in `binary_cross_entropy_with_logits`.
Pull Request resolved: pytorch#9173
Reviewed By: SsnL
Differential Revision: D8764174
Pulled By: ezyang
fbshipit-source-id: b9c4136dbf0182e8ae77082e6448d23a430d5cb6
jramseyer
pushed a commit
to jramseyer/pytorch
that referenced
this pull request
Jul 30, 2018
goodlux
pushed a commit
to goodlux/pytorch
that referenced
this pull request
Aug 15, 2018
This PR removes `distributions.utils._log_sum_exp` in favor of `torch.logsumexp`. Also fixes some warnings with the `reduce` arg in `binary_cross_entropy_with_logits`.
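For reference, `torch.logsumexp` computes `log(sum(exp(x)))` with the standard max-shift trick, which is what the removed `_log_sum_exp` helper existed for. A 1-D pure-Python sketch of that computation (not the PyTorch implementation):

```python
import math

def logsumexp(xs):
    # Stable log(sum(exp(x))): subtract the max before exponentiating so the
    # largest term becomes exp(0) = 1 and nothing overflows.
    m = max(xs)
    return m + math.log(sum(math.exp(x - m) for x in xs))

# Naive evaluation would overflow here; the shifted form does not:
xs = [1000.0, 1000.0]
assert abs(logsumexp(xs) - (1000.0 + math.log(2.0))) < 1e-9
```

Delegating to `torch.logsumexp` also picks up its `dim`/`keepdim` handling and autograd support, so the distributions code no longer maintains a private copy of this trick.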