
Concentration of tempered posteriors and of their variational approximations


  • Speaker: Pierre Alquier, ENSAE
  • Time: Friday 16 November 2018, 16:00–17:00
  • Venue: MR12

If you have a question about this talk, please contact Dr Sergio Bacallado.

While Bayesian methods are extremely popular in statistics and machine learning, their application to massive datasets is often challenging, when possible at all. Indeed, classical MCMC algorithms are prohibitively slow when both the model dimension and the sample size are large. Variational Bayesian (VB) methods aim at approximating the posterior by a distribution in a tractable family, so that MCMC sampling is replaced by an optimization algorithm that is orders of magnitude faster. VB methods have been applied to such computationally demanding tasks as collaborative filtering, image and video processing, NLP and text processing. However, despite very good results in practice, the theoretical properties of these approximations are usually not known. In this work, we propose a general approach to prove the concentration of variational approximations of fractional posteriors. We apply our theory to various examples: matrix completion, Gaussian VB, nonparametric regression, mixture models and other machine learning problems.
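To illustrate the idea of a Gaussian variational approximation of a fractional (tempered) posterior, here is a minimal sketch on a hypothetical conjugate toy model (a Gaussian mean with a Gaussian prior, which is not one of the talk's examples). In this case the alpha-tempered posterior is itself Gaussian, so gradient ascent on the tempered evidence lower bound over a Gaussian family recovers it exactly; all parameter values are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy model: x_i ~ N(theta, sigma2), prior theta ~ N(0, tau2).
# The alpha-tempered ("fractional") posterior is proportional to
#   prior(theta) * likelihood(theta)**alpha,  0 < alpha <= 1.
sigma2, tau2, alpha = 1.0, 10.0, 0.5
theta_true, n = 2.0, 500
x = rng.normal(theta_true, np.sqrt(sigma2), size=n)

# Closed-form tempered posterior N(mu_n, v_n) (conjugacy):
prec = 1.0 / tau2 + alpha * n / sigma2
v_n = 1.0 / prec
mu_n = (alpha * np.sum(x) / sigma2) * v_n

# Gaussian VB: maximise the tempered ELBO over q = N(m, s^2)
# by plain gradient ascent on (m, log s). Up to a constant,
#   ELBO(m, s) = -0.5*prec*(s^2 + m^2)
#                + (alpha*sum(x)/sigma2)*m + log s.
m, log_s, lr = 0.0, 0.0, 1e-3
for _ in range(5000):
    s2 = np.exp(2 * log_s)
    grad_m = -prec * m + alpha * np.sum(x) / sigma2
    grad_log_s = -prec * s2 + 1.0
    m += lr * grad_m
    log_s += lr * grad_log_s

# In this conjugate case the VB optimum matches the tempered
# posterior: m -> mu_n and exp(2*log_s) -> v_n.
print(m, mu_n, np.exp(2 * log_s), v_n)
```

The same optimization-based recipe applies when the tempered posterior has no closed form; one then works within the chosen tractable family, which is exactly the setting whose concentration properties the talk studies.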

This talk is based on joint work with James Ridgway, Nicolas Chopin and Badr-Eddine Chérief-Abdellatif.

  • http://www.jmlr.org/papers/v17/15-290.html
  • https://arxiv.org/abs/1706.09293
  • http://dx.doi.org/doi:10.1214/18-EJS1475

This talk is part of the Statistics series.


© 2006-2019 Talks.cam, University of Cambridge.