Fast yet Simple Natural-Gradient Variational Inference in Complex Models

If you have a question about this talk, please contact Dr R.E. Turner.

This talk has been cancelled.

Approximate Bayesian inference holds promise for improving the generalization and reliability of deep learning, but it is computationally challenging. Modern variational-inference (VI) methods sidestep the challenge by formulating Bayesian inference as an optimization problem and solving it with gradient-based methods. In this talk, I will argue in favor of natural-gradient approaches, which can improve the convergence of VI by exploiting the information geometry of the solutions. I will discuss a fast yet simple natural-gradient method obtained by using a duality associated with exponential-family distributions. I will also summarize some of our recent results on Bayesian deep learning, where natural-gradient methods lead to an approach whose updates are simpler than those of existing VI methods while performing comparably to them.
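As a rough illustration of the idea (not the speaker's code), the sketch below fits a 1-D Gaussian q(theta) = N(m, 1/s) to a toy unnormalized posterior using the natural-gradient update that the exponential-family duality yields: stepping along the gradient with respect to the expectation parameters gives a Newton-like update for the precision s and a preconditioned gradient step for the mean m. The target log-density, step size, and Monte Carlo sample count are all assumptions made for the example.

import numpy as np

rng = np.random.default_rng(0)

# Toy unnormalized log posterior (an assumption for illustration):
# log p(theta) = -0.5 * theta^2 - 0.1 * theta^4 + const
def grad_log_p(theta):
    return -theta - 0.4 * theta**3

def hess_log_p(theta):
    return -1.0 - 1.2 * theta**2

m, s = 0.0, 1.0        # mean and precision of the Gaussian approximation q
beta, n_mc = 0.1, 32   # step size and number of Monte Carlo samples

for t in range(500):
    theta = m + rng.standard_normal(n_mc) / np.sqrt(s)  # samples from q
    g = grad_log_p(theta).mean()   # Monte Carlo estimate of E_q[grad log p]
    h = hess_log_p(theta).mean()   # Monte Carlo estimate of E_q[hess log p]
    s = (1 - beta) * s + beta * (-h)  # precision: Newton-like update
    m = m + beta * g / s              # mean: step preconditioned by 1/s

print(f"q(theta) approx N({m:.3f}, {1/s:.3f})")

Because the Hessian of the toy target is everywhere negative, the precision stays positive throughout; the per-iteration cost is the same as an ordinary stochastic-gradient step, which is the simplicity the abstract refers to.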

Joint work with Wu Lin (UBC), Didrik Nielsen (RIKEN), Voot Tangkaratt (RIKEN), Yarin Gal (UOxford), Akash Srivastava (UEdinburgh), Zuozhu Liu (SUTD).

Based on:
https://emtiyaz.github.io/papers/isita2018_preprint.pdf
https://arxiv.org/abs/1806.04854
https://arxiv.org/abs/1703.04265

This talk is part of the Machine Learning Reading Group @ CUED series.
