Make Multinomial robust against batches #4169
Conversation
Seems like there are legit test errors.
Yeah, I'll go through those tomorrow.
Force-pushed from bc67073 to 6ca644d
@@ -597,14 +597,10 @@ def __init__(self, n, p, *args, **kwargs):
         super().__init__(*args, **kwargs)

         p = p / tt.sum(p, axis=-1, keepdims=True)
-        n = np.squeeze(n)  # works also if n is a tensor
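The surviving normalization line can be sketched in plain numpy (standing in for `theano.tensor`, imported as `tt` in the real code): `p` is renormalized along its last axis so each batch row sums to one, and `keepdims=True` preserves the batch shape so broadcasting still works.

```python
import numpy as np

# Sketch of the kept normalization step, with numpy in place of tt.
# Each row of p is an (unnormalized) event-probability vector.
p = np.array([[1.0, 2.0, 1.0],
              [3.0, 1.0, 0.0]])

# Divide by the row sums; keepdims keeps the shape (2, 1) so the
# division broadcasts row-wise and the batch dimension survives.
p_norm = p / np.sum(p, axis=-1, keepdims=True)
print(p_norm.sum(axis=-1))  # each row sums to 1.0
```

Because `keepdims=True` leaves the batch axes intact, this step is safe for arbitrarily deep batch shapes, unlike the removed `np.squeeze(n)` call.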
Pretty simple fix, just remove some code 👍
Also needs a line in the release notes.
Codecov Report

@@            Coverage Diff             @@
##           master    #4169      +/-   ##
==========================================
+ Coverage   88.76%   88.77%   +0.01%
==========================================
  Files          89       89
  Lines       14083    14079       -4
==========================================
- Hits        12501    12499       -2
+ Misses       1582     1580       -2
Thanks @lucianopaz!
Thanks @lucianopaz! Funny to see that removing code fixed the issue 😅
At the moment, our Multinomial distribution mangles the n parameter's shape. This makes it very difficult to work with batches that have more than 2 dimensions. This PR addresses the problem and adds a test for it.
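The shape mangling the description refers to comes from the removed `np.squeeze(n)` call: `squeeze` drops every size-1 axis, so a batched trial count with a singleton dimension silently loses its batch structure. A minimal numpy sketch (the shapes are illustrative, not taken from the PR's test):

```python
import numpy as np

# One batch of two trial counts, kept as a (1, 2) array on purpose.
n = np.full((1, 2), 5)
print(np.squeeze(n).shape)   # (2,) -- the batch axis is gone

# With a deeper batch the distortion is worse: only the middle
# singleton axis is dropped, so the shape no longer matches p.
n3 = np.full((4, 1, 2), 5)
print(np.squeeze(n3).shape)  # (4, 2) instead of (4, 1, 2)
```

Dropping the squeeze lets n keep whatever batch shape the user gave it, which is what makes the distribution robust for batches with more than 2 dimensions.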