Replies: 2 comments 10 replies
-
Thanks for the suggestion, support for outlier-robust inference would be quite useful in many places.
Indeed. Looking at the Laplace approximation derived in Sec. 3.1 of https://papers.nips.cc/paper/2009/file/13fe9d84310e77f13a6d184dbf1232f3-Paper.pdf, which was used in the paper, this should be pretty straightforward to implement. I haven't looked into this in detail, but one should be able to implement it by following the Gaussian likelihood implementation and modifying it accordingly. I'd be happy to help review a gpytorch PR. There is a related gpytorch issue about Student-t processes (which are somewhat different from using a Student-t likelihood): cornellius-gp/gpytorch#1858. cc @qingfeng10, who had thought about looking into this in the past as well.
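To make the outlier-robustness intuition concrete, here is a small pure-Python sketch (not GPyTorch code; the function names are made up for illustration) comparing the Gaussian and Student-t log-densities at an outlying residual. The t likelihood's heavier tails penalize the outlier far less, which is what makes inference robust:

```python
import math

def gaussian_log_pdf(r, sigma=1.0):
    # log N(r | 0, sigma^2)
    return -0.5 * math.log(2 * math.pi * sigma ** 2) - 0.5 * (r / sigma) ** 2

def student_t_log_pdf(r, nu=4.0, sigma=1.0):
    # log density of a Student-t with nu degrees of freedom and scale sigma
    return (
        math.lgamma((nu + 1) / 2) - math.lgamma(nu / 2)
        - 0.5 * math.log(nu * math.pi * sigma ** 2)
        - (nu + 1) / 2 * math.log1p((r / sigma) ** 2 / nu)
    )

# A 5-sigma residual is penalized much more harshly by the Gaussian:
print(gaussian_log_pdf(5.0))   # ≈ -13.42
print(student_t_log_pdf(5.0))  # ≈ -5.93
```

The log penalty grows quadratically in the residual for the Gaussian but only logarithmically for the Student-t, so a single bad observation dominates the Gaussian marginal likelihood while barely affecting the t version.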
-
I did a bit more literature research on this topic, so I'll just note it down here:
-
Hi,
first of all, sorry for creating so many discussions and issues at the moment. We are currently evaluating which steps to take next in our BO endeavour ;) I promise that I will also implement some of the proposed solutions, and you will see PRs.
I came across this paper today, which I found very interesting: http://proceedings.mlr.press/v84/martinez-cantin18a/martinez-cantin18a.pdf. They use a GP with an approximate Student-t likelihood to detect outliers in the BO loop. The predictive distribution of the approximate Student-t likelihood, which allows for exact inference, is shown in Equation 8.
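I'm not reproducing their Equation 8 here, but a toy, stdlib-only illustration (all names and numbers below are made up) shows why a Student-t observation model helps against outliers: the maximum-likelihood location under a t likelihood essentially ignores a single corrupted observation, whereas the Gaussian estimate (the sample mean) is dragged toward it:

```python
import math

def t_log_pdf(r, nu=4.0, scale=0.1):
    # Student-t log density with nu degrees of freedom and scale
    return (
        math.lgamma((nu + 1) / 2) - math.lgamma(nu / 2)
        - 0.5 * math.log(nu * math.pi * scale ** 2)
        - (nu + 1) / 2 * math.log1p((r / scale) ** 2 / nu)
    )

def normal_log_pdf(r, scale=0.1):
    # Gaussian log density with the same scale, for comparison
    return -0.5 * math.log(2 * math.pi * scale ** 2) - 0.5 * (r / scale) ** 2

# Five clean observations near 1.0 plus one gross outlier.
ys = [0.9, 1.0, 1.1, 1.0, 0.95, 10.0]

def mle_location(log_pdf):
    # Crude grid search for the location maximizing the summed log likelihood.
    grid = [i / 100 for i in range(0, 1100)]
    return max(grid, key=lambda m: sum(log_pdf(y - m) for y in ys))

print(mle_location(normal_log_pdf))  # ≈ 2.49 (the sample mean, pulled by the outlier)
print(mle_location(t_log_pdf))       # ≈ 1.0  (the outlier is effectively ignored)
```

The same mechanism is what lets the paper's approach down-weight outliers inside the BO loop instead of corrupting the surrogate's fit.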
So far I have never touched likelihoods in GPyTorch, but from what I saw this would amount to implementing a new likelihood, right? Could you give me some hints on how best to implement the proposed approach, or do you know if it is already available?
Best,
Johannes