This repository contains the code for "Information-Theoretic Analysis of Epistemic Uncertainty in Bayesian Meta-learning" by Sharu Theresa Jose, Sangwoo Park, and Osvaldo Simeone.
The code is written in Python 3.8 and uses PyTorch 1.8.1.
- C-MINE with SMILE can be found in `utils/cmi.py`.
- ME(M)R computation (Eqs. 11 and 13) can be found in `memr.py`. Detailed usage can be found below.
- Information-theoretic bounds computation (Eq. 15) can be found in `mi_para.py` (parameter-level sensitivity) and `mi_hyper.py` (hyperparameter-level sensitivity). Detailed usage can be found below.
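For reference, SMILE (Song and Ermon, 2019) estimates mutual information with a clipped Donsker-Varadhan lower bound. The snippet below is a minimal NumPy sketch of that objective, not the repository's implementation (which lives in `utils/cmi.py`); it assumes the critic scores `t_joint` (on paired samples) and `t_marg` (on shuffled pairs) have already been computed by a trained critic network.

```python
import numpy as np

def smile_lower_bound(t_joint, t_marg, tau=5.0):
    """SMILE estimate of mutual information from critic scores.

    t_joint: critic scores T(x_i, y_i) on samples from the joint p(x, y).
    t_marg:  critic scores on shuffled pairs, i.e. samples from p(x)p(y).
    tau:     clipping level; tau -> infinity recovers the plain
             Donsker-Varadhan bound, smaller tau reduces variance.
    """
    clipped = np.clip(t_marg, -tau, tau)
    n = len(t_marg)
    # E_p[T] - log E_q[exp(clip(T, -tau, tau))]
    log_mean_exp = np.log(np.sum(np.exp(clipped))) - np.log(n)
    return np.mean(t_joint) - log_mean_exp

# Illustrative call with constant scores (a real critic's outputs would
# be plugged in here): joint scores above marginal ones give a positive
# estimate, as expected for dependent variables.
print(smile_lower_bound(np.ones(128), np.zeros(128)))  # -> 1.0
```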
- Data generation: execute `runs/small_std_W/total_data_gen.sh`. For the default settings and other argument options, please refer to `datagen.py`.
- MER: execute `runs/small_std_W/memr_conven_over_m.sh`. For the default settings and other argument options, please refer to `memr.py`.
- MEMR: execute `runs/small_std_W/memr_over_m.sh` and `runs/small_std_W/memr_over_N.sh`. For the default settings and other argument options, please refer to `memr.py`.
- Information-theoretic bounds (hyperparameter-level sensitivity): execute `runs/small_std_W/mi_hyper_over_m.sh` and `runs/small_std_W/mi_hyper_over_N.sh`. For the default settings and other argument options, please refer to `mi_hyper.py`.
- Information-theoretic bounds (parameter-level sensitivity): execute `runs/small_std_W/mi_para_over_m.sh`. For the default settings and other argument options, please refer to `mi_para.py`.
- To reproduce the Fig. 2 case, change `/small_std_W/` to `/large_std_W/` in the script paths above.