Max Welling
Bayesian inference is a natural and elegant tool for handling uncertainty. Unfortunately, in all but trivial cases, exact Bayesian inference is computationally highly demanding. In the face of very large datasets, even approximate inference schemes, such as MCMC, may become impractical. In this talk I will present progress on a class of approximate Bayesian inference methods that scale to very large datasets. These “stochastic gradient MCMC” methods use only small subsets of the data for every update, and computation can be conveniently distributed across many machines. We will discuss some applications of this technique to Bayesian Topic Modeling, Collaborative Filtering and Network Models.

Bayesian inference is also the method of choice in likelihood-free settings, where the likelihood can only be assessed through, often highly complex, simulations. We show how the same stochastic gradient MCMC methods can also be successfully applied to high-dimensional likelihood-free inference problems.
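To make the minibatch idea concrete, here is a minimal sketch of stochastic gradient Langevin dynamics (SGLD, the canonical stochastic gradient MCMC method of Welling and Teh). The toy model, the fixed step size, and all variable names are illustrative assumptions, not the specific algorithms from the talk: each update uses an unbiased gradient estimate from a small data subset, rescaled by N/n, plus Gaussian noise whose variance matches the step size.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy problem (assumed for illustration): x_i ~ N(theta, 1) with a N(0, 10) prior on theta.
theta_true = 2.0
N = 10_000
data = rng.normal(theta_true, 1.0, size=N)

def stochastic_grad_log_posterior(theta, batch):
    """Unbiased minibatch estimate of the gradient of log p(theta | data)."""
    prior_grad = -theta / 10.0                            # d/dtheta log N(theta; 0, 10)
    lik_grad = (N / len(batch)) * np.sum(batch - theta)   # rescaled minibatch likelihood term
    return prior_grad + lik_grad

theta = 0.0
eps = 1e-4          # step size (kept fixed here; the original method anneals it)
samples = []
for t in range(5000):
    batch = rng.choice(data, size=100, replace=False)
    grad = stochastic_grad_log_posterior(theta, batch)
    # Langevin update: half a gradient step plus injected noise of variance eps
    theta = theta + 0.5 * eps * grad + rng.normal(0.0, np.sqrt(eps))
    samples.append(theta)

posterior_mean = np.mean(samples[1000:])   # discard burn-in
```

For this conjugate Gaussian toy model the chain concentrates near the true posterior mean (close to 2.0), while each update touches only 100 of the 10,000 data points, which is what makes the approach attractive at very large scale.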