Structural Expectation Propagation (SEP): Bayesian structure learning for networks with latent variables
If you have a question about this talk, please contact Zoubin Ghahramani.
Learning the structure of discrete Bayesian networks has been the subject of extensive research in machine learning. One of the few methods that can handle networks with latent variables is the “structural EM algorithm”, which interleaves greedy structure search with the estimation of latent variables and parameters, maintaining a single best network at each step. I will describe Structural Expectation Propagation (SEP), a novel method that iteratively updates a variational posterior distribution over structure, latent variables, and parameters. Conventional EP makes local distribution updates based on a ‘context’ that captures uncertainty in the remainder of the network. SEP extends this context to include uncertainty in structure, and returns a variational distribution over structures rather than a single network. I will demonstrate the performance of SEP on synthetic problems, as well as on real-world data from a clinical study of asthma and allergies.
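To give a flavour of the structure-search component that structural EM (and, in distributional form, SEP) builds on, the toy sketch below runs greedy hill-climbing over parent sets using a BIC score on fully observed binary data. It is an illustrative assumption-laden sketch, not the method from the talk: the data, variable names, fixed topological order, and scoring choices are all hypothetical, and latent variables are omitted entirely.

```python
# Toy sketch (hypothetical): greedy BIC-scored structure search over a
# tiny fully observed binary Bayesian network.  Structural EM would
# interleave a search like this with latent-variable estimation; SEP
# instead maintains a distribution over structures.
import math
import random

random.seed(0)

# Synthetic data with a true edge X0 -> X1; X2 is independent noise.
data = []
for _ in range(500):
    x0 = random.random() < 0.5
    x1 = random.random() < (0.9 if x0 else 0.1)
    x2 = random.random() < 0.5
    data.append((int(x0), int(x1), int(x2)))

def log_lik(child, parents):
    """Maximum-likelihood log-likelihood of one conditional table."""
    counts = {}
    for row in data:
        key = tuple(row[p] for p in parents)
        cell = counts.setdefault(key, [0, 0])
        cell[row[child]] += 1
    ll = 0.0
    for n0, n1 in counts.values():
        n = n0 + n1
        for k in (n0, n1):
            if k:
                ll += k * math.log(k / n)
    return ll

def bic(structure):
    """BIC score of a parent-set assignment {child: parents}."""
    n = len(data)
    score = 0.0
    for child, parents in structure.items():
        score += log_lik(child, parents)
        score -= 0.5 * math.log(n) * (2 ** len(parents))  # parameter penalty
    return score

# Greedy hill-climbing over single-edge additions; acyclicity is kept
# trivially by only allowing parents with a smaller index.
structure = {0: (), 1: (), 2: ()}
improved = True
while improved:
    improved = False
    best = bic(structure)
    for child in (0, 1, 2):
        for parent in range(child):
            if parent in structure[child]:
                continue
            trial = dict(structure)
            trial[child] = structure[child] + (parent,)
            if bic(trial) > best:
                structure, best, improved = trial, bic(trial), True

print(structure)  # the true edge 0 -> 1 should be recovered
```

The BIC penalty is what keeps the spurious edges into X2 out: each candidate edge must buy enough log-likelihood to cover its extra parameters. SEP's contribution, per the abstract, is to replace this single maintained network with a variational posterior over structures.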
This talk is part of the Machine Learning @ CUED series.