An Introduction to Variational Methods for Approximate Inference in Graphical Models

If you have a question about this talk, please contact Richard Samworth.

Many graphical models of practical interest do not admit exact probabilistic inference and therefore require approximations. Variational methods are deterministic approximation methods that have been extensively studied and applied in the Machine Learning community as an alternative to stochastic approximation methods such as Markov chain Monte Carlo (MCMC). Among the characteristics that make variational methods attractive are their often lower computational cost and their ability to provide bounds on quantities of interest. Although they cannot in general recover exact results, since they rely on an analytical approximation to the distribution of interest, they have been shown to give comparable or superior performance to MCMC in several real-world applications. In this talk I will explain, through simple examples, the basic principles and properties of variational methods and present some successful applications. I will also show how loopy belief propagation can be formulated in a variational framework and introduce a few extensions that have been derived from this viewpoint. Finally, I will describe the link between variational transformations and convex duality.
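To make the "basic principles" mentioned above concrete, here is a minimal sketch of mean-field variational inference on a hypothetical toy model (not one from the talk): a pairwise model over two binary variables, where a fully factorised distribution q(x1)q(x2) is fitted by coordinate ascent on the evidence lower bound (ELBO). The ELBO is exactly the kind of bound the abstract refers to: it never exceeds the true log partition function.

```python
import math
import itertools

# Toy pairwise model (illustrative assumption, not from the talk):
# p(x1, x2) ∝ exp(a*x1 + b*x2 + c*x1*x2), with x_i in {0, 1}
a, b, c = 1.0, -0.5, 2.0

def log_ptilde(x1, x2):
    """Log of the unnormalised target distribution."""
    return a * x1 + b * x2 + c * x1 * x2

# Exact log partition function -- feasible only because the model is tiny.
logZ = math.log(sum(math.exp(log_ptilde(x1, x2))
                    for x1, x2 in itertools.product([0, 1], repeat=2)))

def sigmoid(t):
    return 1.0 / (1.0 + math.exp(-t))

# Mean-field approximation q(x1, x2) = q1(x1) * q2(x2),
# parameterised by q1 = q(x1=1) and q2 = q(x2=1).
q1, q2 = 0.5, 0.5
for _ in range(50):
    # Each update sets one factor to its ELBO-optimal value given the other:
    # q_i(x_i) ∝ exp(E_{q_-i}[log p~(x)]).
    q1 = sigmoid(a + c * q2)
    q2 = sigmoid(b + c * q1)

def entropy(p):
    """Entropy of a Bernoulli(p) distribution (in nats)."""
    return 0.0 if p in (0.0, 1.0) else -p * math.log(p) - (1 - p) * math.log(1 - p)

# ELBO = E_q[log p~(x)] + H(q); it is always a lower bound on log Z,
# with equality only if q matches p exactly.
elbo = (a * q1 + b * q2 + c * q1 * q2) + entropy(q1) + entropy(q2)

print(f"log Z = {logZ:.4f}, ELBO = {elbo:.4f}")
```

Because the coupling c correlates x1 and x2 while q assumes independence, the bound is not tight; the gap is the KL divergence between q and the true posterior, which is the quantity mean-field methods minimise.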

This talk is part of the Statistics Reading Group series.



© 2006-2023, University of Cambridge.