
Implement "Unifying View of Sparse Approximations ..." methods #84

Open
Crown421 opened this issue Dec 15, 2021 · 8 comments

@Crown421
Member

As far as I can tell, this package currently implements a general approximate distribution q, which is optimized "as a whole", meaning the number of optimization variables scales with the number of inducing points.

There exist various approximations for q that reduce the computational effort of optimization; see e.g. A Unifying View of Sparse Approximate Gaussian Process Regression by Quiñonero-Candela & Rasmussen.

I would like to give implementing those here a go in the near future, but wanted to open an issue first.

@willtebbutt
Member

This is one way to go. You might also be interested in the more recent A Unifying Framework for Gaussian Process Pseudo-Point Approximations Using Power Expectation Propagation.

@willtebbutt
Member

willtebbutt commented Dec 15, 2021

> As far as I can tell, this package currently implements a general approximation distribution q, which can be optimized "as a whole", which means as many optimization variables as there are inducing points.

We also have the closed-form version of the approximation here implemented in AbstractGPs, for when the likelihood is Gaussian and such a closed-form solution exists.

@Crown421
Member Author

Crown421 commented Feb 4, 2022

I have looked into that paper, as well as related ones, and dug through the current implementation, so I am now essentially ready to implement.
My plan would be to implement something fairly general for the generative-model approximations, reusing as much code as possible but adjusting where necessary for DTC/FITC/PITC/...

However, I can see a few different options for how to connect to AbstractGPs.jl, and wanted to ask which would be best:

  1. Implement everything in AbstractGPs instead of here (extending the current implementation for VFE/ DTC).
  2. Leave the ApproxPosteriorGP struct in AbstractGPs, but move internals for generative models into this package.
  3. Move everything into this package (ApproxPosteriorGP struct and types)

Given that AbstractGPs.jl is supposed to be a lean package to build on top of (I believe?), I think options 2 and 3 make more sense?

Thoughts?

@willtebbutt
Member

Option 2 would be my preference: everything other than the ApproxPosteriorGP struct and lines 169-201 of the sparse_approximations.jl file would move to ApproximateGPs.jl.

We've definitely discussed this elsewhere, but I can't find the issue now...

@sharanry

sharanry commented Mar 14, 2022

GaussianProcesses.jl's implementation of FITC seems to redirect the predict function to DTC's mean and covariance. Am I missing something, or is that wrong?

@willtebbutt
Member

Yeah, we could probably do that. As the comment in the GaussianProcesses.jl implementation points out, you would just need to modify the observation variance.

Specifically for FITC, I think computing the approximate log marginal probability would just involve calling _compute_intermediates with a modified noise covariance matrix in fx. I'm imagining that the same thing would be possible with approximate posterior predictions, but you would have to check.
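To make the relationship concrete, here is a minimal NumPy sketch of the standard DTC/FITC predictive-mean equations. None of the names below (`rbf`, `sparse_mean`, the `fitc` flag) come from ApproximateGPs.jl or GaussianProcesses.jl; this is only an illustration of the "modified observation variance" idea under the usual textbook formulation:

```python
import numpy as np

# Illustrative squared-exponential kernel; names here are hypothetical,
# not the API of ApproximateGPs.jl or GaussianProcesses.jl.
def rbf(x, z, lengthscale=0.5):
    return np.exp(-0.5 * (x[:, None] - z[None, :]) ** 2 / lengthscale**2)

def sparse_mean(xtrain, y, xstar, xind, noise, fitc=False):
    """Posterior mean at xstar under DTC (fitc=False) or FITC (fitc=True).

    Both share the low-rank approximation Q_ff = K_fu K_uu^{-1} K_uf and the
    same predictive equations; FITC only swaps the effective noise
        sigma^2 I  ->  diag(K_ff - Q_ff) + sigma^2 I
    (cf. the equations in Quinonero-Candela & Rasmussen, 2005).
    """
    Kuu = rbf(xind, xind) + 1e-10 * np.eye(len(xind))  # jitter for stability
    Kfu = rbf(xtrain, xind)
    Qff = Kfu @ np.linalg.solve(Kuu, Kfu.T)
    lam = np.full(len(xtrain), noise)
    if fitc:
        # Per-point correction: what the low-rank Qff misses on the diagonal.
        lam = lam + (rbf(xtrain, xtrain).diagonal() - Qff.diagonal())
    Qsf = rbf(xstar, xind) @ np.linalg.solve(Kuu, Kfu.T)
    return Qsf @ np.linalg.solve(Qff + np.diag(lam), y)
```

A quick sanity check: when the inducing points coincide with the training inputs, Q_ff = K_ff, the FITC correction vanishes, and both methods reduce to the exact GP predictive mean; with fewer inducing points the two give different answers.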

@sharanry

> As the comment in the GaussianProcesses.jl implementation points out, you would just need to modify the observation variance.

@willtebbutt I think I missed this. Could you point me to where this is said?

@willtebbutt
Member

Ah, sorry, I guess I'm referring to the contents of equations 21-24b in the linked paper.
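For reference, the relationship those equations encode can be summarized as follows (a sketch of the standard result, not this package's notation). With $Q_{ff} = K_{fu} K_{uu}^{-1} K_{uf}$, DTC and FITC share the same predictive equations and differ only in the effective noise term,

$$\Lambda_{\mathrm{DTC}} = \sigma^2 I, \qquad \Lambda_{\mathrm{FITC}} = \operatorname{diag}\!\left[K_{ff} - Q_{ff}\right] + \sigma^2 I,$$

so that, e.g., the predictive mean in both cases has the form $\mu_* = Q_{*f}\,(Q_{ff} + \Lambda)^{-1} y$.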
