Structural Expectation Propagation (SEP): Bayesian structure learning for networks with latent variables
If you have a question about this talk, please contact Zoubin Ghahramani.
Learning the structure of discrete Bayesian networks has been the subject of extensive research in machine learning. One of the few methods that can handle networks with latent variables is the "structural EM algorithm", which interleaves greedy structure search with the estimation of latent variables and parameters, maintaining a single best network at each step. I will describe Structural Expectation Propagation (SEP), a novel method that iteratively updates a variational posterior distribution over structure, latent variables, and parameters. Conventional EP makes local distribution updates based on a "context" that captures uncertainty in the remainder of the network. SEP extends this context to include uncertainty in structure, and returns a variational distribution over structures rather than a single network. I will demonstrate the performance of SEP on synthetic problems, as well as on real-world data from a clinical study of asthma and allergies.
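The cavity/"context" mechanic that the abstract attributes to conventional EP can be illustrated with a minimal sketch. This is not SEP itself (the talk's extension to structural uncertainty is not reproduced here); it is a hedged toy example, with all function and variable names invented for illustration, showing the standard EP pattern: divide the site factor out of the global approximation to form the context, incorporate the true factor, then refresh the site and the global approximation. The factors here are Gaussian, so moment matching is exact.

```python
import numpy as np

def ep_gaussian(factor_means, factor_precs, n_sweeps=3):
    """Toy EP in one dimension with Gaussian site factors.

    The posterior is approximated as a product of Gaussian sites,
    parameterised by natural parameters (precision tau, precision-mean nu).
    Each site is updated against its cavity -- the 'context' formed by
    removing that site from the global approximation.
    """
    n = len(factor_means)
    tau = np.zeros(n)   # site precisions
    nu = np.zeros(n)    # site precision-means
    tau_q, nu_q = 1e-8, 0.0  # global approximation: start from a broad prior
    for _ in range(n_sweeps):
        for i in range(n):
            # Cavity ("context"): remove site i from the global approximation.
            tau_cav = tau_q - tau[i]
            nu_cav = nu_q - nu[i]
            # Tilted distribution: cavity times the true factor i.
            # Both are Gaussian here, so this step is exact moment matching.
            tau_tilt = tau_cav + factor_precs[i]
            nu_tilt = nu_cav + factor_precs[i] * factor_means[i]
            # New site = tilted / cavity; refresh the global approximation.
            tau[i] = tau_tilt - tau_cav
            nu[i] = nu_tilt - nu_cav
            tau_q, nu_q = tau_tilt, nu_tilt
    return nu_q / tau_q, 1.0 / tau_q  # posterior mean and variance

# Combining the factors N(0, 1) and N(2, 1) gives posterior N(1, 0.5).
mean, var = ep_gaussian(np.array([0.0, 2.0]), np.array([1.0, 1.0]))
```

SEP, as described above, would additionally carry uncertainty over network structure inside this context, so that each local update is made against a distribution over structures rather than a single fixed network.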
This talk is part of the Machine Learning @ CUED series.