BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//Talks.cam//talks.cam.ac.uk//
X-WR-CALNAME:Talks.cam
BEGIN:VEVENT
SUMMARY:Constraining Variational Inference with Geometric Jensen-Shannon D
 ivergence - Jacob Deasy
DTSTART:20201027T131500Z
DTEND:20201027T141500Z
UID:TALK152017@talks.cam.ac.uk
CONTACT:Mateja Jamnik
DESCRIPTION:"Join us on Zoom":https://zoom.us/j/99166955895?pwd=SzI0M3pMVE
 kvNmw3Q0dqNDVRalZvdz09\n\nWe examine the problem of controlling diverg
 ences for latent space regularisation in variational autoencoders\, spec
 ifically when aiming to reconstruct an example x∈ℝ^m via a latent spac
 e z∈ℝ^n (n≤m) while balancing this against the need for generalisab
 le latent representations. We present a regularisation mechanism based o
 n the skew geometric-Jensen-Shannon divergence (JSGα). We find a variati
 on in JSGα\, motivated by limiting cases\, which leads to an intuitive i
 nterpolation between forward and reverse KL in the space of both distrib
 utions and divergences. We motivate its potential benefits for VAEs thro
 ugh low-dimensional examples\, before presenting quantitative and qualit
 ative results. Our experiments demonstrate that skewing our variant of J
 SGα\, in the context of JSGα-VAEs\, leads to better reconstruction and g
 eneration compared to several baseline VAEs. Our approach is entirely un
 supervised and utilises only one hyperparameter\, which can be easily in
 terpreted in latent space.
LOCATION:Zoom
END:VEVENT
END:VCALENDAR
