BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//talks.cam.ac.uk//v3//EN
BEGIN:VTIMEZONE
TZID:Europe/London
BEGIN:DAYLIGHT
TZOFFSETFROM:+0000
TZOFFSETTO:+0100
TZNAME:BST
DTSTART:19700329T010000
RRULE:FREQ=YEARLY;BYMONTH=3;BYDAY=-1SU
END:DAYLIGHT
BEGIN:STANDARD
TZOFFSETFROM:+0100
TZOFFSETTO:+0000
TZNAME:GMT
DTSTART:19701025T020000
RRULE:FREQ=YEARLY;BYMONTH=10;BYDAY=-1SU
END:STANDARD
END:VTIMEZONE
BEGIN:VEVENT
CATEGORIES:Isaac Newton Institute Seminar Series
SUMMARY:Isotonic regression in general dimensions - Richar
d Samworth (University of Cambridge)
DTSTART;TZID=Europe/London:20180319T143000
DTEND;TZID=Europe/London:20180319T153000
UID:TALK102580AThttp://talks.cam.ac.uk
URL:http://talks.cam.ac.uk/talk/index/102580
DESCRIPTION:Co-authors: Qiyang Han (University of Washington)
\, Tengyao Wang (University of Cambridge)\, Sabya
sachi Chatterjee (University of Illinois)\n\n
We study the least squares regression function est
imator over the class of real-valued functions on
$[0\,1]^d$ that are increasing in each coordinate.
For uniformly bounded signals and with a fixed\,
cubic lattice design\, we establish that the estim
ator achieves the minimax rate of order $n^{-\\min
\\{2/(d+2)\,1/d\\}}$ in the empirical $L_2$ l
oss\, up to poly-logarithmic factors. Further\, we
prove a sharp oracle inequality\, which reveals i
n particular that when the true regression functio
n is piecewise constant on $k$ hyperrectangles\, t
he least squares estimator enjoys a faster\, adapt
ive rate of convergence of $(k/n)^{\\min(1\,2/d)}$\,
again up to poly-logarithmic factors. Previous re
sults are confined to the case $d\\leq 2$. Finally
\, we establish corresponding bounds (which are ne
w even in the case $d=2$) in the more challenging
random design setting. There are two surpris
ing features of these results: first\, they demons
trate that it is possible for a global empirical r
isk minimisation procedure to be rate optimal up t
o poly-logarithmic factors even when the correspon
ding entropy integral for the function class diver
ges rapidly\; second\, they indicate that the adap
tation rate for shape-constrained estimators can b
e strictly worse than the parametric rate.
LOCATION:Seminar Room 1\, Newton Institute
CONTACT:INI IT
END:VEVENT
END:VCALENDAR