
Bootstrapped_Deep_Ensembles

This repository contains the code used for the experiments in our article 'Confident Neural Network Regression With Bootstrapped Ensembles'.

The code for each experiment in the paper has its own file. Our experiment_1 is based on the publicly available UCI data sets popularised by Hernández-Lobato and Adams [1] and subsequently used by, among others, Gal and Ghahramani [2]. We used the same train-test splits and obtained the data sets from Yarin Gal's GitHub page accompanying his dropout paper [2]; a sketch of how such a split can be loaded is given below.
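The following is a minimal sketch of loading one of these train-test splits, assuming the directory layout of Gal's DropoutUncertaintyExps repository (a per-dataset data folder containing data.txt, index_features.txt, index_target.txt, and index_train_<split>.txt / index_test_<split>.txt). The function name and paths are illustrative, not part of this repository's actual interface.

```python
import numpy as np

def load_uci_split(dataset_dir, split):
    """Load one UCI train-test split.

    Assumes the file layout of Yarin Gal's DropoutUncertaintyExps
    repository; adjust dataset_dir to your local copy.
    """
    data = np.loadtxt(f"{dataset_dir}/data.txt")
    feature_cols = np.loadtxt(f"{dataset_dir}/index_features.txt", dtype=int)
    target_col = int(np.loadtxt(f"{dataset_dir}/index_target.txt"))
    train_idx = np.loadtxt(f"{dataset_dir}/index_train_{split}.txt", dtype=int)
    test_idx = np.loadtxt(f"{dataset_dir}/index_test_{split}.txt", dtype=int)

    X, y = data[:, feature_cols], data[:, target_col]
    return X[train_idx], y[train_idx], X[test_idx], y[test_idx]

# Example: the first split of the Boston Housing data set (path is illustrative).
X_train, y_train, X_test, y_test = load_uci_split(
    "UCI_Datasets/bostonHousing/data", split=0
)
```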

[1] Gal, Y., & Ghahramani, Z. (2016, June). Dropout as a bayesian approximation: Representing model uncertainty in deep learning. In international conference on machine learning (pp. 1050-1059). PMLR.

[2] Hernández-Lobato, J. M., & Adams, R. (2015, June). Probabilistic backpropagation for scalable learning of bayesian neural networks. In International conference on machine learning (pp. 1861-1869). PMLR.
