A collection of papers on combining deep learning with Bayesian nonparametric approaches
We use the shorthand "deep Bayesian nonparametrics" (DBNP) for the line of work that brings deep learning and Bayesian nonparametrics together. Broadly, DBNP covers not only combining neural networks with stochastic processes in Bayesian modelling, but also leveraging common and effective deep-learning structures, such as convolution, recurrence, and deep hierarchies, in the Bayesian nonparametric setting; introducing nonparametric methods into the structural design of neural networks; and reinterpreting neural networks as Bayesian nonparametric models from various perspectives. We are also interested in the training methods designed for these models, especially approximate inference.
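For intuition about the "deep hierarchies in a Bayesian nonparametric setting" theme (e.g. the deep Gaussian process papers listed below), here is a minimal NumPy sketch of drawing a sample from a two-layer deep GP prior by composing GP layers. The function names (`rbf_kernel`, `sample_gp_layer`) and all settings are illustrative choices, not code from any of the listed papers.

```python
# A minimal sketch of a two-layer deep GP prior: each layer is a function
# drawn from a GP, and the output of one layer is fed as the input of the next.
import numpy as np

def rbf_kernel(x, z, lengthscale=0.5, variance=1.0):
    """Squared-exponential kernel k(x, z) between two sets of 1-D inputs."""
    sq_dist = (x[:, None] - z[None, :]) ** 2
    return variance * np.exp(-0.5 * sq_dist / lengthscale ** 2)

def sample_gp_layer(x, rng, jitter=1e-6):
    """Draw one function sample f ~ GP(0, k) evaluated at the inputs x."""
    K = rbf_kernel(x, x) + jitter * np.eye(len(x))  # jitter for numerical stability
    L = np.linalg.cholesky(K)
    return L @ rng.standard_normal(len(x))

rng = np.random.default_rng(0)
x = np.linspace(-3.0, 3.0, 200)

h = sample_gp_layer(x, rng)   # hidden layer: f1(x)
y = sample_gp_layer(h, rng)   # output layer: f2(f1(x)), a deep GP prior sample
```

The papers below differ mainly in how they make inference in such composed models tractable (variational inference, expectation propagation, random feature expansions, Hamiltonian Monte Carlo, etc.) and in which deep-learning structures they import (convolutional kernels, recurrence, deep kernels).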
- Deep Gaussian Processes
- Nested Variational Compression in Deep Gaussian Processes
- Training Deep Gaussian Processes using Stochastic Expectation Propagation and Probabilistic Backpropagation
- Variational Auto-encoded Deep Gaussian Processes
- Deep Gaussian Processes for Regression using Approximate Expectation Propagation
- Random Feature Expansions for Deep Gaussian Processes
- Doubly Stochastic Variational Inference for Deep Gaussian Processes
- Deep Gaussian Processes with Decoupled Inducing Inputs
- Deep Gaussian Processes with Convolutional Kernels
- Inference in Deep Gaussian Processes using Stochastic Gradient Hamiltonian Monte Carlo
- Efficient Global Optimization using Deep Gaussian Processes
- Deep Convolutional Gaussian Processes
- Deep Gaussian Processes for Multi-fidelity Modeling
- Deep Gaussian Processes with Importance-Weighted Variational Inference
- Compositional Uncertainty in Deep Gaussian Processes
- Implicit Posterior Variational Inference for Deep Gaussian Processes
- Deep Bayesian Neural Nets as Deep Matrix Gaussian Processes
- Deep Neural Networks as Gaussian Processes
- Gaussian Process Behaviour in Wide Deep Neural Networks
- Deep Convolutional Networks as Shallow Gaussian Processes
- Bayesian Convolutional Neural Networks with Many Channels are Gaussian Processes
- On the Connection between Neural Processes and Gaussian Processes with Deep Kernels
- Approximate Inference Turns Deep Networks into Gaussian Processes
- Non-Gaussian Processes and Neural Networks at Finite Widths
- Wide Feedforward or Recurrent Neural Networks of Any Architecture are Gaussian Processes
- Recurrent Gaussian Processes
- Deep Recurrent Gaussian Process with Variational Sparse Spectrum Approximation
- Convolutional Gaussian Processes
- Overcoming Mean-Field Approximations in Recurrent Gaussian Process Models
- Deep Kernel Learning
- Learning Scalable Deep Kernels with Recurrent Structure
- Stochastic Variational Deep Kernel Learning
- Semi-supervised Deep Kernel Learning: Regression with Unlabeled Data by Minimizing Predictive Variance
- Calibrating Deep Convolutional Gaussian Processes
- Differentiable Compositional Kernel Learning for Gaussian Processes
- Deep Learning with Differential Gaussian Process Flows
- Finite Rank Deep Kernel Learning
- Adaptive Deep Kernel Learning
- Stick-breaking Variational Autoencoders
- Indian Buffet Process Deep Generative Models
- Nonparametric Variational Autoencoders for Hierarchical Representation Learning
- Nonparametric Bayesian Deep Networks with Local Competition
- A Bayesian Nonparametric Topic Model with Variational Auto-encoders
- Deep Bayesian Nonparametric Tracking
- Gaussian Process Prior Variational Autoencoders
- Deep Generative Model with Beta Bernoulli Process for Modeling and Learning Confounding Factors
- Stick-breaking Neural Latent Variable Models
- Deep Bayesian Nonparametric Factor Analysis
- Deep Factors with Gaussian Processes for Forecasting