Accelerating Variance-Reduced Stochastic Gradient Methods
- Speaker: Derek Driggs (University of Cambridge)
- Date & Time: Monday 04 November 2019, 12:00 - 12:30
- Venue: Seminar Room 2, Newton Institute Gatehouse
Abstract
Variance reduction is a crucial tool for improving the slow convergence of stochastic gradient descent. Only a few variance-reduced methods, however, have so far been shown to benefit directly from Nesterov's acceleration techniques and match the convergence rates of accelerated gradient methods. In this work, we develop a universal acceleration framework that allows all popular variance-reduced methods to achieve accelerated convergence rates. The constants appearing in these rates, including their dependence on the dimension n, scale with the mean-squared error and bias of the gradient estimator. In a series of numerical experiments, we demonstrate that versions of the popular gradient estimators SAGA, SVRG, SARAH, and SARGE using our framework significantly outperform their non-accelerated counterparts.
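As a rough illustration of the ingredients mentioned in the abstract, the sketch below combines an SVRG-style variance-reduced gradient estimator with a Nesterov-style momentum step. The function names, parameters, and the particular way momentum is coupled to the estimator are illustrative assumptions, not the acceleration framework presented in the talk.

```python
import numpy as np

# Illustrative sketch only: an SVRG-style variance-reduced gradient step
# combined with a Nesterov-style momentum update, for minimizing the
# average of n smooth losses f_i. This is NOT the speaker's framework.

def accelerated_svrg(grad_i, x0, n, step=0.1, momentum=0.9,
                     epochs=20, inner_iters=None, rng=None):
    """grad_i(x, i) returns the gradient of the i-th loss at x."""
    rng = np.random.default_rng() if rng is None else rng
    inner_iters = n if inner_iters is None else inner_iters
    x, x_prev = x0.copy(), x0.copy()
    for _ in range(epochs):
        # Snapshot point and full gradient, recomputed once per epoch.
        x_snap = x.copy()
        full_grad = np.mean([grad_i(x_snap, i) for i in range(n)], axis=0)
        for _ in range(inner_iters):
            # Nesterov-style extrapolation point.
            y = x + momentum * (x - x_prev)
            i = rng.integers(n)
            # SVRG estimator: unbiased, with variance shrinking as the
            # iterates approach the snapshot point.
            g = grad_i(y, i) - grad_i(x_snap, i) + full_grad
            x_prev, x = x, y - step * g
    return x
```

For example, with `grad_i = lambda x, i: A[i] * (A[i] @ x - b[i])` for a least-squares problem, each inner step uses a single sample while the variance of the estimator decays as the snapshot is refreshed.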
Series
This talk is part of the Cambridge-Imperial Computational PhD Seminar series.