BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//Talks.cam//talks.cam.ac.uk//
X-WR-CALNAME:Talks.cam
BEGIN:VEVENT
SUMMARY:Accelerating Variance-Reduced Stochastic Gradient Methods - Derek 
 Driggs (University of Cambridge)
DTSTART:20191104T120000Z
DTEND:20191104T123000Z
UID:TALK134404@talks.cam.ac.uk
CONTACT:Matthew Colbrook
DESCRIPTION:Variance reduction is a crucial tool for improving the slow co
 nvergence of stochastic gradient descent. Only a few variance-reduced meth
 ods\, however\, have yet been shown to directly benefit from Nesterov's ac
 celeration techniques to match the convergence rates of accelerated gradie
 nt methods. In this work\, we develop a universal acceleration framework t
 hat allows all popular variance-reduced methods to achieve accelerated co
 nvergence rates. The constants appearing in these rates\, including their
  dependence on the dimension n\, scale with the mean-squared error and bia
 s of the gradient estimator. In a series of numerical experiments\, we dem
 onstrate that versions of the popular gradient estimators SAGA\, SVRG\, SA
 RAH\, and SARGE using our framework significantly outperform non-accelerat
 ed versions.
LOCATION:Seminar Room 2\, Newton Institute Gatehouse
END:VEVENT
END:VCALENDAR
