Maximizing the expected log predictive density (ELPD) is a common objective in Bayesian inference and Bayesian neural networks, and I'd like to have it available as a loss function for a package I'm building. I'd be happy to implement it myself if someone here is willing to help me figure out the API (I haven't used JuliaML before; I'm building a package that does efficient LOO-CV for Bayesian models).
(Note: ELPD is the generalization of log-loss/cross-entropy from classification problems to regression problems.)
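For concreteness, here is a minimal sketch of the quantity I have in mind, written as plain Julia rather than against any existing JuliaML API. It assumes the user supplies a matrix of pointwise log-likelihood values over posterior draws, and estimates ELPD as the sum over observations of the log posterior-mean likelihood (the names `logmeanexp` and `elpd_estimate` are hypothetical, just for illustration; the negative of this would act as the loss):

```julia
using Statistics

# Numerically stable log-mean-exp over posterior draws.
function logmeanexp(x::AbstractVector{<:Real})
    m = maximum(x)
    return m + log(mean(exp.(x .- m)))
end

# Estimate ELPD from a matrix log_lik[s, i] of log p(y_i | θ_s),
# where s indexes posterior draws and i indexes observations:
#   elpd ≈ Σ_i log( (1/S) Σ_s exp(log_lik[s, i]) )
function elpd_estimate(log_lik::AbstractMatrix{<:Real})
    return sum(logmeanexp(view(log_lik, :, i)) for i in axes(log_lik, 2))
end

# Illustrative usage: 1000 posterior draws, 50 observations,
# with log-likelihood values generated purely for demonstration.
log_lik = randn(1000, 50) .- 1.0
@show elpd_estimate(log_lik)
```

How this should map onto the package's loss interface (e.g. whether it takes a distribution per observation or a draws-by-observations log-likelihood matrix) is exactly the API question I'd like help with.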