BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//Talks.cam//talks.cam.ac.uk//
X-WR-CALNAME:Talks.cam
BEGIN:VEVENT
SUMMARY:Fundamental limits of deep generative neural networks - Helmut Bö
 lcskei (ETH Zürich)
DTSTART:20211210T100000Z
DTEND:20211210T110000Z
UID:TALK165160@talks.cam.ac.uk
DESCRIPTION:Deep neural networks have been employed very successfully as g
 enerative models for complex natural data such as images and natural la
 nguage. In practice this is realized by training deep networks so that t
 hey generate high-dimensional probability distributions by transforming s
 imple low-dimensional distributions such as the uniform or Gaussian. T
 he aim of this talk is to develop an understanding of the fundamental re
 presentational capabilities of deep generative neural networks. Specific
 ally\, we show that every d-dimensional probability distribution of boun
 ded support can be generated through deep ReLU networks out of a 1-dimen
 sional uniform input distribution. What is more\, this is possible with
 out incurring a cost\, in terms of approximation error as measured in Wa
 sserstein distance\, relative to generating the d-dimensional target dis
 tribution from d independent random variables. This is enabled by a spac
 e-filling approach which elicits the importance of network depth in driv
 ing the Wasserstein distance between the target distribution and its neu
 ral network approximation to zero. Finally\, we show that the number of b
 its needed to encode the corresponding generative networks equals the fu
 ndamental limit for encoding probability distributions as dictated by qu
 antization theory.
LOCATION:Seminar Room 1\, Newton Institute
END:VEVENT
END:VCALENDAR
