Gavin Leech

Supervisors:

Websites:

General Profile: 

I've done a few things: working at a tech company in China, farm subsidy design in Tanzania, selling books, spreadsheet jockeying, official statistics, making websites and medical device software, data science. I think one of the most important things in the world is to build future AI systems well, whatever 'well' means. I like hills, punk, Mexican food, writing, barbells, writing, technical solutions to philosophical problems, and writing.

Research Project Summary:

Exact inference is intractable in many realistic latent variable models. Of the available approximations, variational inference is fast but tends to underestimate the posterior variance, while Markov chain Monte Carlo estimates the variance well but is far too slow for large models (Bishop, 2006; Betancourt, 2020). For policy applications, where the variance must be estimated accurately before making large, irreversible decisions, we therefore need new methods. Extending Aitchison's (2019) work on speeding up variational autoencoders, we seek to generalise the use of tensor products for approximate inference.
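As a toy illustration of the variance problem (not part of the project itself), consider fitting a mean-field Gaussian to a correlated Gaussian target. The reverse-KL optimum has a known closed form: each factor's precision equals the corresponding diagonal entry of the target's precision matrix (Bishop, 2006), so the marginal variances come out systematically too small. The correlation value below is made up for illustration.

```python
import numpy as np

# Toy target: a 2D Gaussian "posterior" with strong correlation.
rho = 0.9
Sigma = np.array([[1.0, rho],
                  [rho, 1.0]])

# Mean-field VI (reverse KL) fitted to a Gaussian target has a closed-form
# optimum: the factorised Gaussian q_i = N(mu_i, 1 / Lambda_ii), where
# Lambda is the target's precision matrix.
Lambda = np.linalg.inv(Sigma)
vi_marginal_var = 1.0 / np.diag(Lambda)

print("true marginal variances:", np.diag(Sigma))   # [1.0, 1.0]
print("mean-field VI variances:", vi_marginal_var)  # ~[0.19, 0.19]
```

With rho = 0.9 the mean-field approximation reports marginal variances of about 0.19 against a true value of 1: exactly the kind of overconfidence that is dangerous in policy settings.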

The end goal is multi-sample inference for any such scheme, and we aim to implement this in a probabilistic programming language (PPL) to maximise usability and impact. There are already 'tensorised' PPLs, in the weak sense of using tensor operations for arbitrary probabilistic programs with one inference scheme (e.g. Bingham et al., 2019, which uses stochastic variational inference for all runs). We seek a further abstraction that works for any inference scheme. In our project, 'tensorised' denotes the tensor products used to achieve the speedup.
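To make 'multi-sample, tensorised inference' concrete, here is a minimal sketch in plain NumPy (not any particular PPL, and not our project's actual method): draw K samples per latent variable, then average an importance-weighted estimate over all K² sample combinations with broadcasted tensor operations rather than a Python loop, in the spirit of Aitchison (2019). The toy chain model, proposals, and constants are made up for illustration.

```python
import numpy as np
from scipy.special import logsumexp
from scipy.stats import norm

rng = np.random.default_rng(0)
K = 32  # samples per latent variable

# Toy chain model: z1 ~ N(0, 1), z2 | z1 ~ N(z1, 1), x | z2 ~ N(z2, 1).
x_obs = 1.5

# Factorised proposals (made up for the sketch): K samples of each latent.
z1 = rng.normal(0.0, 2.0, size=K)
z2 = rng.normal(0.0, 2.0, size=K)
log_q1 = norm.logpdf(z1, 0.0, 2.0)
log_q2 = norm.logpdf(z2, 0.0, 2.0)

# Per-factor log-weights, broadcast into a K x K table of sample
# combinations with tensor operations instead of Python loops.
log_f1 = norm.logpdf(z1, 0.0, 1.0) - log_q1          # (K,)   p(z1) / q1(z1)
log_F = norm.logpdf(z2[None, :], z1[:, None], 1.0)   # (K, K) p(z2 | z1)
log_g = norm.logpdf(x_obs, z2, 1.0) - log_q2         # (K,)   p(x | z2) / q2(z2)

# logsumexp over the K x K table gives the log of an unbiased estimate of
# p(x), averaged over all K^2 combinations of the drawn samples.
log_w = log_f1[:, None] + log_F + log_g[None, :]
log_px_hat = logsumexp(log_w) - 2 * np.log(K)
print("multi-sample estimate of log p(x):", log_px_hat)

# Exact answer for this Gaussian chain, for comparison: x ~ N(0, 3).
print("exact log p(x):", norm.logpdf(x_obs, 0.0, np.sqrt(3.0)))
```

For a chain of n latents the same idea covers K^n sample combinations while only ever contracting K x K factors, which is roughly where the speedup from tensor products comes from.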
 
Our chosen application is more realistic epidemic modelling, but the method should be useful in many other areas.