
Consistency and CLTs for stochastic gradient Langevin dynamics based on subsampled data


If you have a question about this talk, please contact Mustapha Amrani.

Advanced Monte Carlo Methods for Complex Inference Problems

Co-authors: Alexandre Thiery (National University of Singapore), Yee-Whye Teh (University of Oxford)

Applying MCMC to large data sets is expensive: both computing the acceptance probability and constructing likelihood-informed proposals require a pass through the entire data set. The recently proposed Stochastic Gradient Langevin Dynamics (SGLD) algorithm circumvents this by generating proposals based on only a subset of the data and skipping the accept-reject step. To heuristically justify the latter, the step sizes are decreased to zero in a non-summable way.

Under appropriate Lyapunov conditions, we provide a rigorous foundation for this algorithm by showing consistency of the step-size-weighted sample average and proving a CLT for it. Surprisingly, the fraction of the data used in each subsample has no influence on the asymptotic variance.
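As a concrete illustration, the SGLD update and the weighted average described above can be sketched as follows. The toy model (a Gaussian likelihood with a Gaussian prior), the subsample size, and the step-size schedule are illustrative assumptions, not taken from the talk; the schedule is chosen so that the step sizes decrease to zero but are non-summable.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative toy model (not from the talk): x_i ~ N(theta, 1),
# with a N(0, 10) prior on theta.
N = 10_000                          # full data set size
theta_true = 2.0
data = rng.normal(theta_true, 1.0, size=N)

def grad_log_prior(theta):
    # d/dtheta log N(theta; 0, 10)
    return -theta / 10.0

def grad_log_lik(theta, batch):
    # d/dtheta of sum_i log N(x_i; theta, 1) over the subsample
    return np.sum(batch - theta)

n = 100                             # subsample size, a fraction n/N of the data
T = 20_000
theta = 0.0
num = 0.0                           # running sums for the weighted average
den = 0.0
for t in range(1, T + 1):
    # Step sizes eps_t -> 0 with sum_t eps_t = infinity (non-summable)
    eps = 1e-4 * t ** (-0.55)
    batch = rng.choice(data, size=n, replace=False)
    # Unbiased estimate of the full log-posterior gradient from the subsample
    g = grad_log_prior(theta) + (N / n) * grad_log_lik(theta, batch)
    # Langevin proposal with injected Gaussian noise; no accept-reject step
    theta = theta + 0.5 * eps * g + np.sqrt(eps) * rng.normal()
    # Step-size-weighted sample average of the test function f(theta) = theta
    num += eps * theta
    den += eps

posterior_mean_estimate = num / den
print(posterior_mean_estimate)
```

With this conjugate toy target the weighted average should settle near the posterior mean (close to 2.0 here); the consistency and CLT results discussed in the talk concern exactly this kind of weighted estimator.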

This talk is part of the Isaac Newton Institute Seminar Series.

