BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//talks.cam.ac.uk//v3//EN
BEGIN:VTIMEZONE
TZID:Europe/London
BEGIN:DAYLIGHT
TZOFFSETFROM:+0000
TZOFFSETTO:+0100
TZNAME:BST
DTSTART:19700329T010000
RRULE:FREQ=YEARLY;BYMONTH=3;BYDAY=-1SU
END:DAYLIGHT
BEGIN:STANDARD
TZOFFSETFROM:+0100
TZOFFSETTO:+0000
TZNAME:GMT
DTSTART:19701025T020000
RRULE:FREQ=YEARLY;BYMONTH=10;BYDAY=-1SU
END:STANDARD
END:VTIMEZONE
BEGIN:VEVENT
CATEGORIES:Cambridge Analysts' Knowledge Exchange
SUMMARY:Can neural networks always be trained? On the boundaries of deep learning - Matthew Colbrook (University of Cambridge)
DTSTART;TZID=Europe/London:20190501T160000
DTEND;TZID=Europe/London:20190501T170000
UID:TALK124339AThttp://talks.cam.ac.uk
URL:http://talks.cam.ac.uk/talk/index/124339
DESCRIPTION:Deep learning has emerged as a competitive new tool in image reconstruction. However\, recent results demonstrate such methods are typically highly unstable - tiny\, almost undetectable perturbations cause severe artefacts in the reconstruction\, a major concern in practice. This is paradoxical given the existence of stable state-of-the-art methods for these problems. Thus\, approximation theoretical results non-constructively imply the existence of stable and accurate neural networks. Hence the fundamental question: Can we explicitly construct/train stable and accurate neural networks for image reconstruction? I will discuss two results in this direction. The first is a negative result\, saying such constructions are in general impossible\, even given access to the solutions of common optimisation algorithms such as basis pursuit. The second is a positive result\, saying that under sparsity assumptions\, such neural networks can be constructed. These neural networks are stable and theoretically competitive with state-of-the-art results from other methods. Numerical examples of competitive performance are also provided.
LOCATION:MR14\, Centre for Mathematical Sciences
END:VEVENT
END:VCALENDAR