
Re-enable passing dims alongside shape or size #5325

Merged (3 commits) on Jan 9, 2022

Conversation

michaelosthege (Member)

This PR makes the following possible again (it was already supported in v3):

with pm.Model():
    pm.Normal("n", size=3, dims="town")

Changes in this PR

  • Dropping the ndim_resize item from the tuple returned by resize_from_shape and resize_from_dims, because it was simply the length of the second item and not used anywhere.
  • Removing the raise ValueError blocks that denied passing dims alongside shape or size.
  • Refactoring/addition of tests to validate that passing shape+dims or size+dims also works when combined with Ellipsis.
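To illustrate the intended semantics, here is a minimal pure-Python sketch (not PyMC's actual implementation; `resolve_shape` and its exact Ellipsis behavior are illustrative assumptions) of how a shape ending in Ellipsis could be completed from the registered dim lengths:

```python
# Hypothetical sketch: fill a trailing Ellipsis in `shape` with lengths
# looked up from the names in `dims`. Not PyMC internals.
def resolve_shape(shape, dims, dim_lengths):
    """Return a concrete shape tuple, resolving a trailing Ellipsis via dims."""
    if shape and shape[-1] is Ellipsis:
        head = tuple(shape[:-1])
        # Axes not covered by the explicit entries are taken from dims.
        tail = tuple(dim_lengths[name] for name in dims[len(head):])
        return head + tail
    return tuple(shape)

resolve_shape((3, ...), ("town", "year"), {"town": 3, "year": 10})
```

Under these assumptions, `(3, ...)` with `dims=("town", "year")` resolves to `(3, 10)`, which is the kind of combination the re-enabled shape+dims tests exercise.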

Depending on what your PR does, here are a few things you might want to address in the description:

  • what are the (breaking) changes that this PR makes? → it undoes a breaking change
  • important background, or details about the implementation
  • are the changes—especially new features—covered by tests and docstrings?
  • linting/style checks have been run
  • consider adding/updating relevant example notebooks
  • right before it's ready to merge, mention the PR in the RELEASE-NOTES.md → not needed, because this feature already worked in v3 and users most probably assume that passing dims in addition to size or shape is valid.

@michaelosthege michaelosthege added this to the v4.0.0b2 milestone Jan 8, 2022
@michaelosthege michaelosthege self-assigned this Jan 8, 2022
codecov bot commented Jan 8, 2022

Codecov Report

Merging #5325 (ab5dd7e) into main (664a447) will increase coverage by 0.05%.
The diff coverage is 100.00%.


@@            Coverage Diff             @@
##             main    #5325      +/-   ##
==========================================
+ Coverage   80.18%   80.23%   +0.05%     
==========================================
  Files          89       89              
  Lines       14863    14857       -6     
==========================================
+ Hits        11918    11921       +3     
+ Misses       2945     2936       -9     
Impacted Files Coverage Δ
pymc/distributions/distribution.py 91.37% <100.00%> (+0.95%) ⬆️
pymc/distributions/shape_utils.py 98.87% <100.00%> (ø)
pymc/parallel_sampling.py 87.70% <0.00%> (+0.99%) ⬆️
pymc/backends/report.py 91.78% <0.00%> (+2.05%) ⬆️

@michaelosthege michaelosthege changed the title Support passing dims alongside shape or size Re-enable passing dims alongside shape or size Jan 8, 2022
ricardoV94 (Member) left a comment

Question: what happens if dims and observed mismatch?

michaelosthege (Member, Author)

michaelosthege commented Jan 9, 2022

> Question: what happens if dims and observed mismatch?

The same as when dims and shape, dims and size, or dims and the actual data shape mismatch: you get shape errors during MCMC or upon InferenceData conversion.

It would be great to make this more robust for the user, for example by validating dim lengths at variable registration, or right before sampling.
I'm constantly running into these kinds of shape problems myself, but I'm also wary of the performance hit from doing all those .eval() calls.

We could also consider changing the signature of Model.dim_lengths from Dict[str, Variable] to Dict[str, Union[Variable, int]] and storing plain ints for dims that don't change (e.g. when they were registered from a TensorConstant).
That would make the aforementioned checks a lot faster.
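A rough sketch of that idea (hypothetical code, not PyMC's; `SymbolicLength` and `dim_matches` are stand-ins invented for illustration): constant dims stored as plain ints can be checked without touching the graph, while symbolic dims still pay the eval cost.

```python
from typing import Dict, Union

class SymbolicLength:
    """Stand-in for a symbolic dim-length Variable; eval() is the slow path."""
    def __init__(self, value: int):
        self._value = value

    def eval(self) -> int:
        # In a real graph this would compile and run a function.
        return self._value

def dim_matches(dim_lengths: Dict[str, Union[SymbolicLength, int]],
                name: str, expected: int) -> bool:
    """Check a dim length, skipping graph evaluation for constant dims."""
    length = dim_lengths[name]
    if isinstance(length, int):       # fast path: registered from a constant
        return length == expected
    return length.eval() == expected  # slow path: symbolic dim

dim_lengths = {"town": 3, "year": SymbolicLength(10)}
```

The design point is only that the Union type lets validation short-circuit for dims whose length can never change, so pre-sampling checks stay cheap.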

@michaelosthege michaelosthege merged commit 9155922 into pymc-devs:main Jan 9, 2022
@michaelosthege michaelosthege deleted the fix-4656 branch January 9, 2022 22:29