BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//Talks.cam//talks.cam.ac.uk//
X-WR-CALNAME:Talks.cam
BEGIN:VEVENT
SUMMARY:Stochastic Gradient Langevin Dynamics for Large Scale Bayesian Inf
 erence - Teh\, YW (University of Oxford)
DTSTART:20140423T152500Z
DTEND:20140423T160000Z
UID:TALK52131@talks.cam.ac.uk
CONTACT:Mustapha Amrani
DESCRIPTION:The Bayesian approach to statistical machine learning is a
  theoretically well-motivated framework for learning from data.  It
  provides a coherent framework for reasoning about uncertainties\, and
  an inbuilt protection against overfitting.  However\, computations in
  the framework can be expensive\, and most approaches to Bayesian
  computation do not scale well to the big data setting.  In this talk
  we propose a new computational approach for Bayesian learning from
  large scale datasets based on iterative learning from small
  mini-batches. By adding the right amount of noise to a standard
  stochastic gradient optimization algorithm we show that the iterates
  will converge to samples from the true posterior distribution as we
  anneal the stepsize. We apply the method to logistic regression and
  latent Dirichlet allocation\, showing state-of-the-art
  performance.\n\nJoint work with Max Welling and Sam Patterson.
LOCATION:Seminar Room 1\, Newton Institute
END:VEVENT
END:VCALENDAR
