NegativeBinomial distribution #9345

Closed
kashif wants to merge 25 commits into pytorch:master from kashif:neg-binomial
Conversation

@kashif (Contributor) commented Jul 11, 2018

  • implement distribution
  • add tests
  • docs

cc @ingmarschuster

@vishwakftw (Contributor)

cc @fritzo

@facebook-github-bot (Contributor) commented:

@ezyang has imported this pull request. If you are a Facebook employee, you can view this diff on Phabricator.

@ingmarschuster left a comment:

This is fine modulo correcting the semantics of unnormalized probability and normalization constant 👍

@vishwakftw (Contributor)
13:40:52 ======================================================================
13:40:52 FAIL: test_negative_binomial_log_prob (__main__.TestDistributions)
13:40:52 ----------------------------------------------------------------------
13:40:52 Traceback (most recent call last):
13:40:52   File "test_distributions.py", line 918, in test_negative_binomial_log_prob
13:40:52     self._check_log_prob(NegativeBinomial(total_count, probs), ref_log_prob)
13:40:52   File "test_distributions.py", line 634, in _check_log_prob
13:40:52     asset_fn(i, val.squeeze(), log_prob)
13:40:52   File "test_distributions.py", line 916, in ref_log_prob
13:40:52     self.assertAlmostEqual(log_prob, expected, places=3)
13:40:52   File "/var/lib/jenkins/workspace/test/common.py", line 348, in assertAlmostEqual
13:40:52     self.assertEqual(x, y, prec, msg, allow_inf)
13:40:52   File "/var/lib/jenkins/workspace/test/common.py", line 296, in assertEqual
13:40:52     self.assertEqual(x.item(), y, prec, message, allow_inf)
13:40:52   File "/var/lib/jenkins/workspace/test/common.py", line 340, in assertEqual
13:40:52     super(TestCase, self).assertLessEqual(abs(x - y), prec, message)
13:40:52 AssertionError: 2.9444389791664403 not less than or equal to 0.001

Build failures are legitimate.

@kashif (Contributor, Author) commented Jul 12, 2018

@vishwakftw indeed... I think it might be that scipy's nbinom uses the probability and the number of successes; it's almost bedtime here, so I will investigate in the morning.
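For reference, the mismatch kashif suspects can be illustrated without scipy: the negative binomial has two common conventions, counting failures before the n-th success (scipy's `nbinom`) versus counting successes before a fixed number of failures (the convention this PR uses), and the two differ only by swapping `p` and `1 - p`. A minimal pure-Python sketch (the helper names are mine, not the PR's):

```python
import math

def nb_logpmf_failures(k, n, p):
    """scipy-style: log P(k failures before the n-th success), success prob p."""
    return (math.lgamma(k + n) - math.lgamma(n) - math.lgamma(k + 1)
            + n * math.log(p) + k * math.log(1.0 - p))

def nb_logpmf_successes(k, r, p):
    """PR-style: log P(k successes before the r-th failure), success prob p."""
    return (math.lgamma(k + r) - math.lgamma(r) - math.lgamma(k + 1)
            + r * math.log(1.0 - p) + k * math.log(p))

# The two conventions agree once p is flipped:
# nb_logpmf_successes(k, r, p) == nb_logpmf_failures(k, r, 1 - p)
```

Comparing against a reference implementation without flipping `p` would produce exactly the kind of large log-prob discrepancy seen in the failing test above.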

@fritzo (Collaborator) left a comment:

Looks good. I have not checked the math.

Could you please add a docs section to docs/source/distributions.rst?

raise ValueError("Either `probs` or `logits` must be specified, but not both.")
if probs is not None:
    self.total_count, self.probs, = broadcast_all(total_count, probs)
    self.total_count = self.total_count.type_as(self.logits)
else:
    self.total_count, self.logits, = broadcast_all(total_count, logits)
    self.total_count = self.total_count.type_as(self.logits)
is_scalar = isinstance(self.logits, Number)

@facebook-github-bot (Contributor) commented:

@ezyang has imported this pull request. If you are a Facebook employee, you can view this diff on Phabricator.

@kashif requested review from ssnl and zou3519 as code owners July 16, 2018 11:10
@fritzo (Collaborator) commented Jul 17, 2018

cc @alicanb


def sample(self, sample_shape=torch.Size()):
    with torch.no_grad():
        rate = torch.distributions.Gamma(concentration=self.total_count,
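The sampler excerpted above uses the gamma-Poisson mixture representation of the negative binomial: draw a rate from Gamma(total_count, (1 - probs)/probs), then draw a Poisson count with that rate. A pure-Python sketch of that construction, using stdlib samplers instead of torch (function name is mine):

```python
import math
import random

def sample_negative_binomial(total_count, probs, rng=random):
    """Draw one negative-binomial sample via the gamma-Poisson mixture:
    lam ~ Gamma(total_count, rate=(1 - probs) / probs), then k ~ Poisson(lam).
    """
    # random.gammavariate takes a *scale* parameter, the inverse of the rate.
    lam = rng.gammavariate(total_count, probs / (1.0 - probs))
    # Knuth's Poisson sampler; fine for the moderate rates used here.
    threshold = math.exp(-lam)
    k, prod = 0, rng.random()
    while prod > threshold:
        k += 1
        prod *= rng.random()
    return k
```

Under this convention the sample mean should approach total_count * probs / (1 - probs), which is a quick sanity check on the parameterization.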

@alicanb (Collaborator) commented Jul 17, 2018

The code is currently failing due to a divide by 0 in sample_gamma; we might need an argcheck here. The Gamma concentration parameter should be positive, but it is possible for a 0 to be fed in here.
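The failure mode alicanb describes is easy to reproduce outside torch: the stdlib gamma sampler rejects a zero shape parameter for the same reason. A small sketch of the kind of argument check being suggested (the helper name is hypothetical, not from the PR):

```python
import random

def nb_mixing_rate(total_count, probs):
    # Hypothetical argcheck: validate before constructing the Gamma mixing
    # distribution, so a zero total_count fails loudly here rather than
    # with a divide-by-zero deep inside the gamma sampler.
    if not total_count > 0:
        raise ValueError("total_count must be positive, got %r" % (total_count,))
    if not 0.0 < probs < 1.0:
        raise ValueError("probs must be in (0, 1), got %r" % (probs,))
    return (1.0 - probs) / probs

# Python's stdlib sampler enforces the same constraint:
# random.gammavariate(0.0, 1.0) raises ValueError.
```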

@kashif (Contributor, Author) commented Jul 18, 2018

Thanks @alicanb for the review... will get it done.

@kashif (Contributor, Author) commented Jul 18, 2018

@alicanb is there a way to replicate the divide by 0 error?

@facebook-github-bot (Contributor) commented:

@ezyang has imported this pull request. If you are a Facebook employee, you can view this diff on Phabricator.

@ezyang (Contributor) commented Jul 20, 2018

@kashif We can get you the docker image for ASAN builds, will that help?

@kashif (Contributor, Author) commented Jul 20, 2018

@ezyang no I have a GPU box, can you tell me how exactly to run the tests with GPU enabled? Thanks!

@lazy_property
def _gamma(self):
    return torch.distributions.Gamma(concentration=self.total_count,
                                     rate=(1. - self.probs) / self.probs)
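This Gamma is the mixing distribution in the gamma-Poisson representation: marginalizing Poisson(k | lam) over lam ~ Gamma(total_count, (1 - probs)/probs) yields exactly the negative binomial pmf. A quick numerical check of that identity in pure Python (the quadrature grid and function names are my choices, not the PR's):

```python
import math

def gamma_pdf(lam, conc, rate):
    """Density of Gamma(concentration=conc, rate=rate) at lam > 0."""
    return math.exp(conc * math.log(rate) - math.lgamma(conc)
                    + (conc - 1.0) * math.log(lam) - rate * lam)

def poisson_pmf(k, lam):
    return math.exp(-lam + k * math.log(lam) - math.lgamma(k + 1))

def mixture_pmf(k, total_count, probs, upper=100.0, steps=50000):
    """Numerically integrate the gamma-Poisson mixture over the rate lam."""
    rate = (1.0 - probs) / probs
    h = upper / steps
    total = 0.0
    for i in range(1, steps + 1):
        lam = i * h
        total += poisson_pmf(k, lam) * gamma_pdf(lam, total_count, rate)
    return total * h

def nb_pmf(k, total_count, probs):
    """Closed-form pmf: k successes before total_count failures, success prob probs."""
    return math.exp(math.lgamma(k + total_count) - math.lgamma(total_count)
                    - math.lgamma(k + 1)
                    + total_count * math.log(1.0 - probs) + k * math.log(probs))
```

The two should agree to quadrature accuracy for moderate k, which is a useful independent check on both the `_gamma` parameterization and the log_prob formula.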

@facebook-github-bot (Contributor) commented:

@ezyang has imported this pull request. If you are a Facebook employee, you can view this diff on Phabricator.

@weiyangfb added the ready for review label Jul 24, 2018
@facebook-github-bot (Contributor) commented:

@ezyang has imported this pull request. If you are a Facebook employee, you can view this diff on Phabricator.

@facebook-github-bot (Contributor) commented:

@ezyang is landing this pull request. If you are a Facebook employee, you can view this diff on Phabricator.

@kashif deleted the neg-binomial branch August 1, 2018 16:25
@kashif (Contributor, Author) commented Aug 1, 2018

cool thanks all!

zdevito pushed a commit to zdevito/ATen that referenced this pull request Aug 1, 2018
Summary:
- [x] implement distribution
- [x] add tests
- [x] docs

cc ingmarschuster
Pull Request resolved: pytorch/pytorch#9345

Differential Revision: D8807023

Pulled By: ezyang

fbshipit-source-id: 7bf7f352dd455e0909c58dd94e1bdebba0e8b5c8
goodlux pushed a commit to goodlux/pytorch that referenced this pull request Aug 15, 2018
Labels: open source, ready for review

8 participants