Deep Learning in High Dimension: Neural Network Approximation of Analytic Maps of Gaussians.
- Speaker: Christoph Schwab (ETH Zürich)
- Date & Time: Wednesday 01 December 2021, 17:00 - 18:30
- Venue: Seminar Room 2, Newton Institute
Abstract
For artificial deep neural networks with ReLU activation, we prove new expression rate bounds for parametric, analytic functions where the parameter dimension may be infinite. Approximation rates are in mean square on the unbounded parameter range with respect to product Gaussian measure. The approximation rate bounds are free from the curse of dimensionality (CoD) and are determined by the summability of Wiener-Hermite polynomial chaos (PC) expansion coefficients. Sufficient conditions for summability are quantified holomorphy on products of strips in the complex domain. Applications comprise DNN expression rate bounds for response surfaces of elliptic PDEs with log-Gaussian random field inputs, and for the posterior densities of the corresponding Bayesian inverse problems. Constructive variants of the proofs are outlined.

(Joint work with Jakob Zech, University of Heidelberg, Germany, and with Dinh Dung and Nguyen Van Kien, Hanoi, Vietnam.)

References: https://math.ethz.ch/sam/research/reports.html?id=982
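As a minimal illustration (not part of the talk) of the objects in the abstract, the sketch below computes one-dimensional Wiener-Hermite PC coefficients of an analytic map of a standard Gaussian input by Gauss-Hermite quadrature, and the resulting mean-square truncation error via Parseval's identity. The test map f(y) = e^y is an assumption chosen because its exact coefficients, exp(1/2)/sqrt(n!) in the orthonormal Hermite basis, are summable, so the partial sums converge rapidly in L² with respect to the Gaussian measure:

```python
import math
import numpy as np
from numpy.polynomial import hermite_e as He

# Gauss-Hermite quadrature nodes/weights for the probabilists' weight
# exp(-y^2 / 2); the raw weights sum to sqrt(2*pi), so renormalize to
# the standard Gaussian measure N(0, 1).
nodes, raw_w = He.hermegauss(60)
weights = raw_w / np.sqrt(2.0 * np.pi)

def hermite_coeff(f, n):
    """n-th Wiener-Hermite coefficient of f in the orthonormal basis
    He_n / sqrt(n!), computed as E[f(Y) He_n(Y)] / sqrt(n!)."""
    Hen = He.hermeval(nodes, [0.0] * n + [1.0])  # He_n at the nodes
    return float(np.sum(weights * f(nodes) * Hen)) / math.sqrt(math.factorial(n))

# Hypothetical analytic test map f(y) = exp(y); exact coefficients are
# exp(1/2) / sqrt(n!).
f = np.exp
coeffs = np.array([hermite_coeff(f, n) for n in range(12)])

# Mean-square truncation error by Parseval: E[f(Y)^2] = exp(2), so the
# error of the degree-N partial sum is exp(2) minus the cumulative sum
# of squared coefficients.
mse = np.exp(2.0) - np.cumsum(coeffs ** 2)
```

The rapid decay of `mse` reflects the summability condition in the abstract: for this analytic map the coefficient sequence decays factorially, so the truncation error is CoD-free in the sense that it is controlled by coefficient summability alone.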
Series: This talk is part of the Isaac Newton Institute Seminar Series.
Included in Lists
- All CMS events
- bld31
- dh539
- Featured lists
- INI info aggregator
- Isaac Newton Institute Seminar Series
- School of Physical Sciences
- Seminar Room 2, Newton Institute
Note: Ex-directory lists are not shown.
