BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//Talks.cam//talks.cam.ac.uk//
X-WR-CALNAME:Talks.cam
BEGIN:VEVENT
SUMMARY:Beating the Curse of Dimensionality: A Theoretical Analysis of Dee
 p Neural Networks and Parametric PDEs - Gitta Kutyniok (Technische Univers
 ität Berlin)
DTSTART:20190620T132000Z
DTEND:20190620T141000Z
UID:TALK126286@talks.cam.ac.uk
CONTACT:INI IT
DESCRIPTION:High-dimensional parametric partial differential equations
  (PDEs) appear in various contexts including control and optimization
  problems\, inverse problems\, risk assessment\, and uncertainty
  quantification. In most such scenarios the set of all admissible
  solutions associated with the parameter space is inherently
  low-dimensional. This fact forms the foundation for the so-called
  reduced basis method.\n\nRecently\, numerical experiments demonstrated
  the remarkable efficiency of using deep neural networks to solve
  parametric problems. In this talk\, we will present a theoretical
  justification for this class of approaches. More precisely\, we will
  derive upper bounds on the complexity of ReLU neural networks
  approximating the solution maps of parametric PDEs. In fact\, without
  any knowledge of the concrete shape of the solution manifold\, we use
  its inherent low-dimensionality to obtain approximation rates which are
  significantly superior to those provided by classical approximation
  results. We use this low-dimensionality to guarantee the existence of
  a reduced basis. Then\, for a large variety of parametric PDEs\, we
  construct neural networks that yield approximations of the parametric
  maps not suffering from a curse of dimensionality and depending
  essentially only on the size of the reduced basis.\n\nThis is joint
  work with Philipp Petersen (Oxford)\, Mones Raslan\, and Reinhold
  Schneider.
LOCATION:Seminar Room 1\, Newton Institute
END:VEVENT
END:VCALENDAR
