Probability seminar: Distributed Bayesian Posterior Sampling via Moment Sharing

13 March 2015, 2.15 PM - 13 March 2015, 3.15 PM

SM3, School of Mathematics

Yee Whye Teh, University of Oxford

In the age of Big Data, statistical methods that can handle large-scale datasets are increasingly important. Recently there has been a spate of Markov chain Monte Carlo (MCMC) algorithms for approximating the posterior distribution given a large dataset distributed across a cluster of machines. Many such algorithms take an "embarrassingly parallel" approach: independent Markov chains are run on each machine using only the local data stored there, with no communication across machines, and a final combination step merges the samples to form an approximation to the posterior given all the data.
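As a toy illustration of the "embarrassingly parallel" combination step, the sketch below shards data for a Gaussian-mean model across several simulated machines, draws exact subposterior samples on each, and merges them with a precision-weighted Gaussian product rule. This is a minimal sketch under strong assumptions (conjugate 1-D model, Gaussian fits to each subposterior); it is not the speaker's code, and all names are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy model: unknown mean theta, known noise sigma, flat prior,
# data split across K simulated "machines".
K, n_per = 4, 250
theta_true, sigma = 2.0, 1.0
shards = [theta_true + sigma * rng.standard_normal(n_per) for _ in range(K)]

def subposterior_samples(x, n_samp=5000):
    """Exact local-posterior samples under a flat prior:
    theta | x ~ N(mean(x), sigma^2 / len(x))."""
    return rng.normal(x.mean(), sigma / np.sqrt(len(x)), size=n_samp)

samples = [subposterior_samples(x) for x in shards]

# Combination step: fit a Gaussian to each machine's samples and
# multiply the densities, i.e. precision-weighted average of means.
means = np.array([s.mean() for s in samples])
precs = np.array([1.0 / s.var() for s in samples])
combined_mean = (precs * means).sum() / precs.sum()
combined_var = 1.0 / precs.sum()

# Compare with the full-data posterior mean; they should agree closely.
full_mean = np.concatenate(shards).mean()
print(combined_mean, full_mean, combined_var)
```

In this conjugate toy example the Gaussian product rule is exact up to Monte Carlo error; the talk's point is that for realistic models the unsynchronised subposteriors can differ widely, making such combination rules fragile.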

We will argue that such algorithms, by their embarrassingly parallel nature, can lead to widely differing machine-specific posteriors, and as a result are highly inefficient and difficult to tune. We propose instead an algorithm in which moment statistics of the machine-specific posteriors are collected from each sampler and propagated across the cluster using Expectation Propagation (EP), a message-passing variational inference framework. EP constrains the machine-specific posteriors to share the same moments, so they are in much closer agreement, leading to improved performance. We demonstrate the speed and inference quality of our method with empirical studies on Bayesian logistic regression and sparse linear regression with a spike-and-slab prior.
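The moment-sharing idea can be sketched in the same toy setting: each machine keeps a Gaussian "site" approximation, forms a cavity distribution from the other machines' sites, estimates the moments of its locally tilted posterior by sampling, and moment-matches to update its site. This is a hedged illustration only, with Gaussian sites in a 1-D conjugate model where the tilted distribution happens to be available in closed form; in the distributed algorithm an MCMC sampler on each machine would estimate these moments instead.

```python
import numpy as np

rng = np.random.default_rng(1)

# Same toy model: Gaussian mean theta, known sigma, flat prior,
# data sharded across K machines (illustrative setup only).
K, n_per, sigma = 4, 250, 1.0
shards = [2.0 + sigma * rng.standard_normal(n_per) for _ in range(K)]

# Natural parameters of each machine's Gaussian site approximation:
# precision r[k] and precision-times-mean b[k], initialised to zero.
r = np.zeros(K)
b = np.zeros(K)

for _ in range(3):                     # a few EP sweeps
    for k in range(K):
        # Cavity: global approximation with site k removed.
        r_cav = r.sum() - r[k]
        b_cav = b.sum() - b[k]
        # Tilted distribution: cavity times shard k's exact likelihood.
        # Gaussian in closed form here; on a real cluster, machine k's
        # sampler would estimate its moments by MCMC.
        r_tilt = r_cav + n_per / sigma**2
        b_tilt = b_cav + shards[k].sum() / sigma**2
        samp = rng.normal(b_tilt / r_tilt, 1 / np.sqrt(r_tilt), 20000)
        m, v = samp.mean(), samp.var()
        # Moment matching: update site k so the global Gaussian
        # approximation matches the tilted moments.
        r[k] = 1.0 / v - r_cav
        b[k] = m / v - b_cav

ep_mean = b.sum() / r.sum()
full_mean = np.concatenate(shards).mean()
print(ep_mean, full_mean)
```

Because every machine's target is tilted by the same shared moments, the machine-specific posteriors stay in agreement, which is the mechanism the abstract credits for the improved performance.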

Joint work with Minjie Xu, Balaji Lakshminarayanan, Jun Zhu, and Bo Zhang.

Contact information

Organisers: Marton Balazs, Haeran Cho

Edit this page