
Restore the pre-4625 behavior of the size kwarg #4662

Closed

michaelosthege opened this issue Apr 24, 2021 · 3 comments · Fixed by #4693

Comments

@michaelosthege
Member

Due to a (big) misunderstanding, my PR #4625 broke how size parametrization works in v4.

The misunderstanding was specifically about the notion of implied dimensions and what it means for univariate vs. multivariate RVs:

⚠ The pre-#4625 notion was that size is in addition to the support dimensions.

⚠ The post-#4625 notion was that size is in addition to what's implied by parameters other than size/shape/dims.

The difference is subtle and perhaps best explained with the following example:

# v4 pre-4625 (45cb4ebf36500e502481bdced6980dd9e630acca)
# (assuming: from numpy import eye, ones; MvNormal, Normal imported from PyMC v4)
MvNormal.dist(
    cov=eye(7), mu=ones(7), size=(2, 3)   # MvNormal is multivariate with ndim_supp=1,
).eval().shape == (2, 3, 7)               # therefore (7,) is implied/required and does not count into `size`.
Normal.dist(
    mu=[1, 2, 3], size=(2, 3)
).eval().shape == (2, 3)                  # Normal is univariate, so `mu` does not count as a support dim.

# v4 post-4625 (e9f2e9616394275ccf7587a4818fe21251d51328)
MvNormal.dist(
    cov=eye(7), mu=ones(7), size=(2, 3)
).eval().shape == (2, 3, 7)
Normal.dist(
    mu=[1, 2, 3], size=(2, 3)
).eval().shape == (2, 3, 3)               # the last dimension of length 3 is implied by `mu` and does not count into `size`.

With the changes from #4625, specifying shape=(1, 2, 3, ...) and size=(1, 2, 3) produces identical outcomes.
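
To make that equivalence concrete, here is a minimal sketch using the Normal example from above (purely illustrative of the post-4625 semantics that this issue reverts):

# Post-4625 semantics (the behavior this issue reverts):
# `size=(2, 3)` and `shape=(2, 3, ...)` both append the dimension
# implied by `mu`, so both calls produce the same shape.
a = Normal.dist(mu=[1, 2, 3], size=(2, 3))
b = Normal.dist(mu=[1, 2, 3], shape=(2, 3, ...))
assert a.eval().shape == b.eval().shape == (2, 3, 3)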

After some discussion about the advantages and disadvantages of either API flavor, we decided to go back to the pre-4625 flavor, where size is essentially shape without the support dimensions.
This is also how NumPy handles the dimensionality of multivariate distributions:

import numpy as np

np.random.multivariate_normal(
    mean=np.ones(7), cov=np.eye(7), size=(2, 3)
).shape == (2, 3, 7)
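
The univariate case works the same way in NumPy: dimensions implied by parameters broadcast against `size` instead of being appended to it. A small sketch, reusing the `np` import above:

# `loc` has shape (3,) and broadcasts against size=(2, 3);
# no trailing dimension of length 3 is appended.
np.random.normal(loc=[1, 2, 3], size=(2, 3)).shape == (2, 3)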

The flexibility added by #4625, namely the ability to omit dimensions that are implied by the RV support or its parameters, will continue to work through the shape-with-Ellipsis API:

# (assuming: import aesara; import numpy as np; Normal imported from PyMC v4)
mu = aesara.shared(np.array([1, 2, 3]))

rv = Normal.dist(
    mu=mu, shape=(7, 5, ...)   # only the additional dimensions are specified explicitly
)
assert rv.eval().shape == (7, 5, 3)

# Now change the parameter-implied dimensions:
mu.set_value(np.array([1, 2, 3, 4]))
assert rv.eval().shape == (7, 5, 4)
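
Support dimensions should be inferable the same way. A hypothetical sketch of how Ellipsis would behave for a multivariate RV once the pre-4625 semantics are restored (names as in the first example):

# Hypothetical: the support dimension (7,) is implied by the
# distribution itself and appended after the explicit (2, 3).
rv = MvNormal.dist(mu=ones(7), cov=eye(7), shape=(2, 3, ...))
assert rv.eval().shape == (2, 3, 7)
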
@michaelosthege michaelosthege added this to the Merge v4 into master milestone Apr 24, 2021
@michaelosthege michaelosthege self-assigned this Apr 24, 2021
michaelosthege added a commit to michaelosthege/pymc that referenced this issue Apr 24, 2021
Corresponding tests were reverted, or edited to use other parametrization flavors.
The Ellipsis feature now works with all three dimensionality kwargs.
Closes pymc-devs#4662
michaelosthege added a commit to michaelosthege/pymc that referenced this issue Apr 24, 2021
Corresponding tests were reverted, or edited to use other parametrization flavors.
The Ellipsis feature now works with all three dimensionality kwargs.
The MultinomialRV implementation was removed, because the broadcasting behavior was implemented in Aesara.
Closes pymc-devs#4662
@twiecki
Member

twiecki commented Apr 29, 2021


_____________________________ test_logp_scalar_ode _____________________________

    @pytest.mark.xfail(
        condition=sys.platform == "win32", reason="See https://github.com/pymc-devs/pymc3/issues/4652."
    )
    def test_logp_scalar_ode():
        """Test the computation of the log probability for these models"""
    
        # Differential equation
        def system_1(y, t, p):
            return np.exp(-t) - p[0] * y[0]
    
        # Parameters and initial condition
        alpha = 0.4
        y0 = 0.0
        times = np.arange(0.5, 8, 0.5)
    
        yobs = np.array(
            [0.30, 0.56, 0.51, 0.55, 0.47, 0.42, 0.38, 0.30, 0.26, 0.21, 0.22, 0.13, 0.13, 0.09, 0.09]
        )[:, np.newaxis]
    
        ode_model = DifferentialEquation(func=system_1, t0=0, times=times, n_theta=1, n_states=1)
    
        integrated_solution, *_ = ode_model._simulate([y0], [alpha])
    
        assert integrated_solution.shape == yobs.shape
    
        # compare automatic and manual logp values
        manual_logp = norm.logpdf(x=np.ravel(yobs), loc=np.ravel(integrated_solution), scale=1).sum()
        with pm.Model() as model_1:
            forward = ode_model(theta=[alpha], y0=[y0])
            y = pm.Normal("y", mu=forward, sd=1, observed=yobs)
>       pymc3_logp = model_1.logp()
     new_var = var.type.filter_variable(new_var, allow_convert=True)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = TensorType(float32, col), other = Rebroadcast{?,0}.0
allow_convert = True

    def filter_variable(self, other, allow_convert=True):
        """
        Convert a symbolic Variable into a TensorType, if compatible.
    
        For the moment, only a TensorType and GpuArrayType will be
        converted, provided they have the same number of dimensions
        and dtype and have "compatible" broadcastable pattern.
    
        """
        if not isinstance(other, Variable):
            # The value is not a Variable: we cast it into
            # a Constant of the appropriate Type.
            other = self.Constant(type=self, data=other)
    
        if other.type == self:
            return other
    
        if allow_convert:
            # Attempt safe broadcast conversion.
            other2 = self.convert_variable(other)
            if other2 is not None and other2.type == self:
                return other2
        raise TypeError(
>           f"Cannot convert Type {other.type} "
            f"(of Variable {other}) into Type {self}. "
            f"You can try to manually convert {other} into a {self}."
        )
E       TypeError: Cannot convert Type TensorType(float32, matrix) (of Variable Rebroadcast{?,0}.0) into Type TensorType(float32, col). You can try to manually convert Rebroadcast{?,0}.0 into a TensorType(float32, col).

@michaelosthege
Member Author

@twiecki that's issue #4652. Any insight is greatly appreciated.

twiecki pushed a commit that referenced this issue May 14, 2021
Corresponding tests were reverted, or edited to use other parametrization flavors.
The Ellipsis feature now works with all three dimensionality kwargs.
The MultinomialRV implementation was removed, because the broadcasting behavior was implemented in Aesara.
Closes #4662
@michaelosthege
Member Author

Closed by #4693
