Consistency and CLTs for stochastic gradient Langevin dynamics based on subsampled data
- 👤 Speaker: Vollmer, S (University of Oxford)
- 📅 Date & Time: Thursday 24 April 2014, 15:50 - 16:25
- 📍 Venue: Seminar Room 1, Newton Institute
Abstract
Co-authors: Alexandre Thiery (National University of Singapore), Yee-Whye Teh (University of Oxford)
Applying MCMC to large data sets is expensive. Both calculating the acceptance probability and creating informed proposals depending on the likelihood require an iteration through the whole data set. The recently proposed Stochastic Gradient Langevin Dynamics (SGLD) circumvents this problem by generating proposals based on only a subset of the data and skipping the accept-reject step. In order to heuristically justify the latter, the step size converges to zero in a non-summable way.
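To make the algorithm concrete, here is a minimal one-dimensional sketch (not taken from the talk; the Gaussian prior/likelihood gradients and the polynomially decaying step-size schedule are illustrative assumptions):

```python
import numpy as np

def grad_log_prior(theta):
    # Hypothetical standard Gaussian prior: grad log p(theta) = -theta
    return -theta

def grad_log_lik(theta, x_batch):
    # Hypothetical Gaussian likelihood with unit variance:
    # grad_theta log p(x | theta) = x - theta
    return x_batch - theta

def sgld(data, theta0, n_iter=10_000, batch_size=10, a=0.1, gamma=0.55):
    """SGLD with decreasing step sizes delta_t = a * t**(-gamma).

    For gamma in (1/2, 1] the steps are square-summable but not summable,
    i.e. they decrease to zero in the non-summable way described above.
    """
    rng = np.random.default_rng(0)
    N = len(data)
    theta = theta0
    samples, steps = [], []
    for t in range(1, n_iter + 1):
        delta = a * t ** (-gamma)
        # Unbiased gradient estimate from a random data subset; no accept-reject step
        batch = data[rng.choice(N, size=batch_size, replace=False)]
        grad = grad_log_prior(theta) + (N / batch_size) * np.sum(grad_log_lik(theta, batch))
        # Langevin update with injected Gaussian noise
        theta = theta + 0.5 * delta * grad + np.sqrt(delta) * rng.standard_normal()
        samples.append(theta)
        steps.append(delta)
    return np.array(samples), np.array(steps)
```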
Under appropriate Lyapunov conditions, we provide a rigorous foundation for this algorithm by showing consistency of the step-size-weighted sample average and proving a central limit theorem (CLT) for it. Surprisingly, the fraction of data used in each subsample has no influence on the asymptotic variance.
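The object studied is the step-size-weighted average along the SGLD trajectory. In generic notation (step sizes $\delta_t$, iterates $\theta_t$, target $\pi$, test function $f$; none of these symbols are fixed in the abstract itself), the consistency statement reads roughly:

```latex
\hat{\pi}_T(f) \;=\; \frac{\sum_{t=1}^{T} \delta_t \, f(\theta_t)}{\sum_{t=1}^{T} \delta_t}
\;\longrightarrow\; \pi(f) \qquad \text{as } T \to \infty,
```

with the CLT describing the fluctuations of $\hat{\pi}_T(f)$ around $\pi(f)$. With the sketch above, this estimator is simply `np.sum(steps * f(samples)) / np.sum(steps)`.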
Series: This talk is part of the Isaac Newton Institute Seminar Series.