
Differential Geometric MCMC Methods


If you have a question about this talk, please contact Zoubin Ghahramani.

In recent years, MCMC methods have increasingly been relied upon as the “last resort” for performing inference over the sophisticated statistical models used to describe complex phenomena. This presents a major challenge, as correct and efficient MCMC-based statistical inference over such models is of growing importance. This talk will argue that differential geometry provides the tools required to develop MCMC sampling methods suited to challenging statistical models. By defining appropriate Riemannian metric tensors and corresponding Levi-Civita manifold connections, MCMC methods based on Langevin diffusions across the model manifold are developed. Furthermore, proposal mechanisms that follow geodesic flows across the manifold will be presented. The optimality of these methods in terms of mixing time will be discussed, and their strengths (and weaknesses) will be assessed experimentally on a range of statistical models, such as log-Gaussian Cox point process models and mixture models; inference over Latent Dirichlet Allocation and Copula Process style models will also be considered. This talk is based on work presented as a Discussion Paper to the Royal Statistical Society, and a dedicated website with Matlab code is available at http://www.ucl.ac.uk/statistics/research/rmhmc
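To make the idea of a metric-aware Langevin proposal concrete, here is a minimal sketch of a simplified manifold MALA step in Python. It is not the speaker's implementation: the target (a toy 2-D Gaussian), the constant metric tensor `G`, and the step size `eps` are all illustrative assumptions. With a constant metric, the connection terms involving derivatives of `G` vanish, so the proposal reduces to a preconditioned Langevin move with a Metropolis-Hastings correction.

```python
import numpy as np

# Hypothetical toy target: a correlated 2-D Gaussian.
Sigma = np.array([[1.0, 0.9], [0.9, 1.0]])
Prec = np.linalg.inv(Sigma)

def log_target(theta):
    return -0.5 * theta @ Prec @ theta

def grad_log_target(theta):
    return -Prec @ theta

# Constant metric tensor: here the Fisher information of the target.
G = Prec
Ginv = np.linalg.inv(G)
L = np.linalg.cholesky(Ginv)  # L @ z ~ N(0, G^{-1}) for z ~ N(0, I)
eps = 0.8                     # illustrative step size

def log_q(to, frm):
    """Log density (up to a constant) of the proposal N(mu(frm), eps^2 G^{-1})."""
    mu = frm + 0.5 * eps**2 * Ginv @ grad_log_target(frm)
    d = to - mu
    return -0.5 * (d @ G @ d) / eps**2

def mmala_step(theta, rng):
    """One simplified manifold-MALA step with MH accept/reject."""
    z = rng.standard_normal(theta.size)
    prop = (theta + 0.5 * eps**2 * Ginv @ grad_log_target(theta)
            + eps * L @ z)
    log_alpha = (log_target(prop) - log_target(theta)
                 + log_q(theta, prop) - log_q(prop, theta))
    if np.log(rng.uniform()) < log_alpha:
        return prop, True
    return theta, False

rng = np.random.default_rng(0)
theta = np.zeros(2)
samples, accepts = [], 0
for _ in range(5000):
    theta, acc = mmala_step(theta, rng)
    samples.append(theta)
    accepts += acc
samples = np.asarray(samples)
```

Because the metric matches the target's precision, the proposal is automatically scaled to the target's geometry, which is the intuition behind the position-dependent metrics discussed in the talk.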

This talk is part of the Machine Learning @ CUED series.

