VinDsl.jl is a work in progress. Not quite alpha, but watch this space!
While Distributions.jl is being updated for automatic differentiation, this package relies on the current master of that package. Use `Pkg.checkout("Distributions")`
to be sure you have the latest updates.
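For example, a minimal setup might look like the sketch below; the clone step and repository URL are assumptions, and only the `Pkg.checkout` call comes from the note above:

```julia
# Hypothetical setup sketch (old-style Pkg API, matching the Pkg.checkout call above).
Pkg.clone("https://github.com/jmxpearson/VinDsl.jl")  # assumed repository URL
Pkg.checkout("Distributions")  # track Distributions.jl master for the AD-related updates

using VinDsl
```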
For contributors: documentation of the design, internals, and todos is available here.
See also this presentation.
Variational inference is an approximate method of statistical inference based on optimization. Unlike conventional Bayesian methods based on Markov Chain Monte Carlo (MCMC), it scales well to large and streaming datasets, making it a competitive technique for machine learning applications.
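Concretely, the optimization maximizes the evidence lower bound (ELBO) over an approximating distribution q(z); this is the standard formulation, not anything specific to VinDsl:

```latex
\mathcal{L}(q) \;=\; \mathbb{E}_{q(z)}\bigl[\log p(x, z) - \log q(z)\bigr] \;\le\; \log p(x)
```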
However, coding variational inference models by hand traditionally involves lots of tedious algebra and careful index accounting. New techniques like black-box variational inference and local expectation gradients avoid much of this, and implementations for some models are already possible in Stan, but no current framework lets researchers mix and match these rapidly developing techniques.
The goal of VinDsl.jl is to provide a set of data abstractions and macros that take the pain out of coding variational inference models. In particular, because the syntactic sugar for defining models is implemented in the same language as the underlying building blocks, the entire framework is easily extensible and hackable.
- Intelligent index handling: you define the model structure, and VinDsl handles the sums over indices automatically (see the sketch after this list)
- A set of macros for coding conjugate models and updates
- Limited support for automatic expectation-taking
- Built-in support for Hidden Markov Models
- Preliminary support for ADVI
- Automatic differentiation (it's coming)
- Support for state-space models
- Variational deep networks/autoencoders
- GPU support
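To make the index-handling point concrete, the sketch below shows the kind of bookkeeping VinDsl aims to automate. It is plain Julia written purely for illustration; it does not use VinDsl's API, and all names in it are made up.

```julia
# Hand-coded expected log-likelihood for a toy mixture-style model with N
# observations and K components: E_q[log p(x | z)] with fixed means requires
# an explicit double sum over indices. This is the bookkeeping VinDsl targets.
function expected_loglik(x::Vector{Float64},   # observations, length N
                         r::Matrix{Float64},   # responsibilities E_q[z[n,k]], N x K
                         mu::Vector{Float64},  # component means (fixed here), length K
                         sigma2::Float64)      # known variance
    N, K = size(r)
    total = 0.0
    for n in 1:N, k in 1:K                     # explicit sums over both indices
        total += r[n, k] * (-0.5 * log(2pi * sigma2) -
                            0.5 * (x[n] - mu[k])^2 / sigma2)
    end
    return total
end

# Example call with made-up numbers
x = randn(5)
r = fill(1/3, 5, 3)
mu = [-1.0, 0.0, 1.0]
expected_loglik(x, r, mu, 1.0)
```

With VinDsl, the intent is that sums like this fall out of the declared model structure rather than being written by hand.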