BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//talks.cam.ac.uk//v3//EN
BEGIN:VTIMEZONE
TZID:Europe/London
BEGIN:DAYLIGHT
TZOFFSETFROM:+0000
TZOFFSETTO:+0100
TZNAME:BST
DTSTART:19700329T010000
RRULE:FREQ=YEARLY;BYMONTH=3;BYDAY=-1SU
END:DAYLIGHT
BEGIN:STANDARD
TZOFFSETFROM:+0100
TZOFFSETTO:+0000
TZNAME:GMT
DTSTART:19701025T020000
RRULE:FREQ=YEARLY;BYMONTH=10;BYDAY=-1SU
END:STANDARD
END:VTIMEZONE
BEGIN:VEVENT
CATEGORIES:Isaac Newton Institute Seminar Series
SUMMARY:Statistical Optimality of Stochastic Gradient Descent on Hard Le
 arning Problems through Multiple Passes - Francis Bach (INRIA Paris - R
 ocquencourt\; ENS - Paris)
DTSTART;TZID=Europe/London:20180628T110000
DTEND;TZID=Europe/London:20180628T114500
UID:TALK107485AThttp://talks.cam.ac.uk
URL:http://talks.cam.ac.uk/talk/index/107485
DESCRIPTION:We consider stochastic gradient descent (SGD) for least-squ
 ares regression with potentially several passes over the data. While se
 veral passes have been widely reported to perform practically better i
 n terms of predictive performance on unseen data\, the existing theore
 tical analysis of SGD suggests that a single pass is statistically opti
 mal. While this is true for low-dimensional easy problems\, we show tha
 t for hard problems\, multiple passes lead to statistically optimal pre
 dictions while a single pass does not\; we also show that in these har
 d models\, the optimal number of passes over the data increases with sa
 mple size. In order to define the notion of hardness and show that our p
 redictive performances are optimal\, we consider potentially infinite-d
 imensional models and notions typically associated with kernel method
 s\, namely\, the decay of eigenvalues of the covariance matrix of the f
 eatures and the complexity of the optimal predictor as measured throug
 h the covariance matrix. We illustrate our results on synthetic experim
 ents with non-linear kernel methods and on a classical benchmark with a li
 near model. (Joint work with Loucas Pillaud-Vivien and Alessandro Rudi)
LOCATION:Seminar Room 1\, Newton Institute
CONTACT:INI IT
END:VEVENT
END:VCALENDAR