BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//talks.cam.ac.uk//v3//EN
BEGIN:VTIMEZONE
TZID:Europe/London
BEGIN:DAYLIGHT
TZOFFSETFROM:+0000
TZOFFSETTO:+0100
TZNAME:BST
DTSTART:19700329T010000
RRULE:FREQ=YEARLY;BYMONTH=3;BYDAY=-1SU
END:DAYLIGHT
BEGIN:STANDARD
TZOFFSETFROM:+0100
TZOFFSETTO:+0000
TZNAME:GMT
DTSTART:19701025T020000
RRULE:FREQ=YEARLY;BYMONTH=10;BYDAY=-1SU
END:STANDARD
END:VTIMEZONE
BEGIN:VEVENT
CATEGORIES:Information Theory Seminar
SUMMARY:How the strength of the inductive bias affects the generalization performance of interpolators - Dr Fanny Yang\, ETH Zurich
DTSTART;TZID=Europe/London:20221123T140000
DTEND;TZID=Europe/London:20221123T150000
UID:TALK179117AThttp://talks.cam.ac.uk
URL:http://talks.cam.ac.uk/talk/index/179117
DESCRIPTION:Interpolating models have recently gained popularity in the statistical learning community due to common practices in modern machine learning: complex models achieve good generalization performance despite interpolating high-dimensional training data. In this talk\, we prove generalization bounds for high-dimensional linear models that interpolate noisy data generated by a sparse ground truth. In particular\, we first show that minimum-l1-norm interpolators achieve high-dimensional asymptotic consistency at a logarithmic rate. Further\, as opposed to the regularized or noiseless case\, for min-lp-norm interpolators with 1
END:VEVENT
END:VCALENDAR