Variational methods continued
- Speaker: David Knowles, Machine Learning Group, University of Cambridge
- Date & Time: Wednesday 03 March 2010, 16:30-17:30
- Venue: MR5, CMS
Abstract
I will continue Silvia’s discussion of variational methods and try to answer some of the questions raised during her talk. We saw how mean field inference gives a lower bound on the log partition function log Z. I will describe a general message passing framework based on alpha divergences, which has mean field and expectation propagation (EP, a generalisation of belief propagation to continuous random variables) as special cases, as well as tree reweighted belief propagation, which can give an upper bound on log Z. EP will be shown to greatly outperform both Gibbs sampling and mean field on certain problems. If there is enough time, I will present some work on choosing the optimal tree to approximate a loopy graph, or even the right distribution over trees.
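For readers who want the objects named in the abstract in symbols, here is a brief sketch following Minka's standard formulation of the alpha divergence family (not necessarily the exact notation used in the talk), together with the identity behind the mean field lower bound:

```latex
% Alpha divergence between target p and approximation q (Minka, 2005):
\[
  D_\alpha(p \,\|\, q)
  = \frac{1}{\alpha(1-\alpha)}
    \left( 1 - \int p(x)^{\alpha}\, q(x)^{1-\alpha}\, dx \right)
\]
% Limiting cases recover the two message passing objectives above:
%   alpha -> 0 gives KL(q||p), minimised by mean field;
%   alpha -> 1 gives KL(p||q), minimised by expectation propagation.
\[
  \lim_{\alpha \to 0} D_\alpha(p \,\|\, q) = \mathrm{KL}(q \,\|\, p),
  \qquad
  \lim_{\alpha \to 1} D_\alpha(p \,\|\, q) = \mathrm{KL}(p \,\|\, q)
\]
% Mean field lower bound on log Z: for unnormalised \tilde{p} with
% Z = \int \tilde{p}(x)\, dx and p = \tilde{p}/Z,
\[
  \log Z
  = \mathbb{E}_q\!\left[ \log \frac{\tilde{p}(x)}{q(x)} \right]
    + \mathrm{KL}(q \,\|\, p)
  \;\ge\; \mathbb{E}_q\!\left[ \log \frac{\tilde{p}(x)}{q(x)} \right]
\]
```

And a minimal runnable sketch of that lower bound on a toy Ising model (the model, its parameters, and all names here are illustrative assumptions, not material from the talk): mean field coordinate ascent on a fully factorised q, with exact log Z computed by brute-force enumeration for comparison.

```python
import itertools
import numpy as np

# Hypothetical toy example: mean field on a small Ising model, checking
# that the variational objective really lower-bounds log Z.
rng = np.random.default_rng(0)
n = 8                                    # number of spins x_i in {-1, +1}
W = rng.normal(scale=0.3, size=(n, n))
W = np.triu(W, 1) + np.triu(W, 1).T      # symmetric couplings, zero diagonal
b = rng.normal(scale=0.2, size=n)        # external fields

def log_ptilde(x):
    """Unnormalised log probability: x'Wx/2 + b'x."""
    return 0.5 * x @ W @ x + b @ x

# Exact log Z by enumeration (feasible only for tiny n).
states = np.array(list(itertools.product([-1, 1], repeat=n)), dtype=float)
log_Z = np.logaddexp.reduce([log_ptilde(x) for x in states])

# Mean field: factorised q with means m_i = E_q[x_i], updated by
# coordinate ascent m_i <- tanh(sum_j W_ij m_j + b_i).
m = np.zeros(n)
for _ in range(200):
    for i in range(n):
        m[i] = np.tanh(W[i] @ m + b[i])

# Lower bound = E_q[log ptilde] + entropy of q (W has zero diagonal,
# so E_q[x'Wx/2] = m'Wm/2 under the factorised q).
q1 = (1 + m) / 2                         # P(x_i = +1) under q
eps = 1e-12
entropy = -(q1 * np.log(q1 + eps) + (1 - q1) * np.log(1 - q1 + eps)).sum()
bound = 0.5 * m @ W @ m + b @ m + entropy

print(f"exact log Z      = {log_Z:.4f}")
print(f"mean field bound = {bound:.4f}  (always <= log Z, for any q)")
```

EP and tree reweighted belief propagation swap out the factorised q and the KL(q||p) objective above for other points in the alpha divergence family and other approximating structures; that unification is the subject of the talk.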
Series: This talk is part of the Statistics Reading Group series.