Refactor TorchDistribution and wrap torch.distributions.Bernoulli #645
Merged
neerajprad merged 2 commits into dev on Dec 21, 2017
Conversation
Member
Author
@neerajprad After this and #647 are merged, I'll address
neerajprad
reviewed
Dec 21, 2017
    ps = F.sigmoid(logits)
    eps = get_clamping_buffer(ps)
    ps = ps.clamp(min=eps, max=1 - eps)
    torch_dist = torch.distributions.Bernoulli(ps)
Member
Do any tests fail if we do not do this clamping here?
Member
Author
Yes, your _flow tests fail. I'm glad you implemented them.
Member
Ahh, I see. Makes sense! Will add logit support in PyTorch soon.
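The clamping discussed in this thread can be sketched as follows. This is an illustrative example, not Pyro's actual code: the fixed epsilon and the helper name `clamp_probs` are assumptions (Pyro derives the epsilon from `get_clamping_buffer`), but it shows why clamping keeps `log_prob` finite for extreme logits.

```python
import torch


def clamp_probs(logits, eps=1e-6):
    # Map logits to probabilities, then clamp away from exactly 0 and 1
    # so that log(ps) and log(1 - ps) in the Bernoulli log-probability
    # stay finite. The eps value here is an illustrative assumption.
    ps = torch.sigmoid(logits)
    return ps.clamp(min=eps, max=1 - eps)


# Extreme logits would saturate sigmoid to exactly 0.0 or 1.0 in
# floating point, making log_prob of the opposite outcome -inf.
logits = torch.tensor([-100.0, 0.0, 100.0])
ps = clamp_probs(logits)
dist = torch.distributions.Bernoulli(ps)
print(dist.log_prob(torch.tensor([1.0, 1.0, 0.0])))  # all finite
```

Without the clamp, the first and last entries above would be `-inf`, which is what breaks gradient-flow tests downstream.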
neerajprad
reviewed
Dec 21, 2017
    return self._param_shape[-1:]
    x_shape = torch.Size(broadcast_shape(alpha.size(), beta.size(), strict=True))
    event_dim = 1
    super(Gamma, self).__init__(torch_dist, x_shape, event_dim, *args, **kwargs)
Member
It's good to see all of this getting absorbed into the wrapper class!
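The `x_shape` computation in the snippet above broadcasts the parameter shapes together. As a rough stand-in for Pyro's `broadcast_shape` helper, later PyTorch versions expose `torch.broadcast_tensors`, which has a similar effect (an assumption for illustration, not the helper the PR actually uses):

```python
import torch

# Two Gamma parameters with broadcastable but unequal shapes.
alpha = torch.ones(3, 1)
beta = torch.ones(1, 4)

# Broadcasting them yields the distribution's overall parameter shape,
# analogous to broadcast_shape(alpha.size(), beta.size()) in the snippet.
x_shape = torch.broadcast_tensors(alpha, beta)[0].size()
print(x_shape)  # torch.Size([3, 4])
```

With `event_dim = 1`, the trailing dimension of `x_shape` becomes the event shape and the rest becomes the batch shape.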
neerajprad
approved these changes
Dec 21, 2017
Member
neerajprad
left a comment
Looks good! I am merging this.
Member
Author
|
Thanks for reviewing! |
This adds a working implementation of pyro.distributions.torch.Bernoulli. It also simplifies the individual torch wrappers by moving the .batch_shape() and .event_shape() methods up to the parent class TorchDistribution. Tested with make test-torch-dist against the test-rsample branch of PyTorch.
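The refactor described above, where shape logic moves into the parent wrapper class, can be sketched like this. The class and method bodies are a simplified illustration under stated assumptions, not Pyro's actual TorchDistribution implementation:

```python
import torch


class TorchDistributionSketch:
    """Illustrative parent wrapper: batch_shape/event_shape live here,
    so subclasses (Bernoulli, Gamma, ...) only need to build torch_dist
    and pass the broadcast parameter shape plus an event dimension."""

    def __init__(self, torch_dist, x_shape, event_dim):
        self.torch_dist = torch_dist
        self._x_shape = x_shape
        self._event_dim = event_dim

    def event_shape(self):
        # The trailing event_dim dimensions of the broadcast shape.
        return self._x_shape[len(self._x_shape) - self._event_dim:]

    def batch_shape(self):
        # Everything except the event dimensions.
        return self._x_shape[:len(self._x_shape) - self._event_dim]


ps = torch.tensor([[0.2, 0.8], [0.5, 0.5]])
d = TorchDistributionSketch(torch.distributions.Bernoulli(ps), ps.size(), 0)
print(d.batch_shape(), d.event_shape())  # torch.Size([2, 2]) torch.Size([])
```

With the shape methods on the parent, each concrete wrapper reduces to constructing the underlying torch.distributions object, which is the simplification the PR description points at.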