Code for reproducing the experiments in the paper:
G. Papamakarios and I. Murray, Fast ε-free Inference of Simulation Models with Bayesian Conditional Density Estimation, NeurIPS 2016. [arXiv] [bibtex]
Folder containing four subfolders, one for each demo in the paper.
- mixture_of_gaussians_demo
  - mog_main.py --- sets up the model
  - mog_abc.py --- runs ABC methods
  - mog_mdn.py --- runs MDN methods
  - mog_res.py --- collects and plots results
- bayesian_linear_regression_demo
  - blr_main.py --- sets up the model
  - blr_abc.py --- runs ABC methods
  - blr_mdn.py --- runs MDN methods
  - blr_res.py --- collects and plots results
- lotka_volterra_demo
  - lv_main.py --- sets up the model
  - lv_abc.py --- runs ABC methods
  - lv_mdn.py --- runs MDN methods
  - lv_res.py --- collects and plots results
- mg1_queue_demo
  - mg1_main.py --- sets up the model
  - mg1_abc.py --- runs ABC methods
  - mg1_mdn.py --- runs MDN methods
  - mg1_res.py --- collects and plots results
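Each demo compares ABC baselines against MDN-based inference on the same model. As a rough illustration of the rejection-ABC baseline that the *_abc.py scripts run, here is a minimal self-contained sketch; the simulator, uniform prior, summary statistic, and tolerance below are invented for the example and are not the repo's actual code:

```python
import numpy as np

def simulate(theta, rng, n=10):
    # hypothetical toy simulator: n Gaussian draws with mean theta
    return rng.normal(theta, 1.0, size=n)

def rejection_abc(x_obs, eps, n_samples, rng):
    # rejection ABC: draw theta from a uniform prior, simulate data,
    # and accept theta when the simulated summary statistic (here the
    # sample mean) is within eps of the observed one
    accepted = []
    while len(accepted) < n_samples:
        theta = rng.uniform(-5.0, 5.0)
        x = simulate(theta, rng)
        if abs(x.mean() - x_obs.mean()) <= eps:
            accepted.append(theta)
    return np.array(accepted)

rng = np.random.default_rng(0)
x_obs = simulate(2.0, rng)                   # "observed" data, true theta = 2
posterior = rejection_abc(x_obs, eps=0.2, n_samples=200, rng=rng)
```

The accepted samples approximate the posterior over theta; shrinking eps tightens the approximation at the cost of more simulator calls, which is the trade-off the paper's MDN approach avoids.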
Folder with utility classes and functions.
- pdf.py --- Gaussians and mixtures of Gaussians
- NeuralNet.py --- neural nets with and without SVI
- mdn.py --- MDNs with and without SVI
- DataStream.py --- provides data minibatches for training
- LossFunction.py --- loss functions for training
- StepStrategy.py --- optimization algorithms, including Adam
- Trainer.py --- trains a neural net or MDN, with or without SVI
- MarkovJumpProcess.py --- Markov jump processes, including Lotka--Volterra
- helper.py --- various helper functions
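MarkovJumpProcess.py simulates models such as Lotka--Volterra as continuous-time Markov jump processes. A minimal sketch of the standard Gillespie algorithm for a three-reaction predator-prey system is below; the function name, reaction set, and rate values are illustrative assumptions, not the module's actual interface:

```python
import numpy as np

def simulate_lv(init, rates, t_end, rng):
    # Gillespie simulation of a Lotka-Volterra Markov jump process.
    # Reactions (X = prey, Y = predators):
    #   prey birth:      X -> 2X         hazard r1 * X
    #   predation:       X + Y -> 2Y     hazard r2 * X * Y
    #   predator death:  Y -> 0          hazard r3 * Y
    x, y = init
    r1, r2, r3 = rates
    t = 0.0
    times, states = [t], [(x, y)]
    while t < t_end:
        hazards = np.array([r1 * x, r2 * x * y, r3 * y])
        total = hazards.sum()
        if total == 0.0:
            break  # absorbing state: no reaction can fire
        # time to next event is exponential with rate = total hazard
        t += rng.exponential(1.0 / total)
        # pick which reaction fires, proportionally to its hazard
        k = rng.choice(3, p=hazards / total)
        if k == 0:
            x += 1
        elif k == 1:
            x -= 1
            y += 1
        else:
            y -= 1
        times.append(t)
        states.append((x, y))
    return np.array(times), np.array(states)

rng = np.random.default_rng(1)
times, states = simulate_lv((50, 100), (0.01, 0.0005, 0.01), 5.0, rng)
```

Exact simulation like this is what makes Lotka--Volterra a likelihood-free problem: sampling trajectories is easy, but evaluating their likelihood is intractable.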