Structural Expectation Propagation (SEP): Bayesian structure learning for networks with latent variables
If you have a question about this talk, please contact Zoubin Ghahramani.
Learning the structure of discrete Bayesian networks has been the subject of extensive research in machine learning. One of the few methods that can handle networks with latent variables is the “structural EM algorithm”, which interleaves greedy structure search with the estimation of latent variables and parameters, maintaining a single best network at each step. I will describe Structural Expectation Propagation (SEP), a novel method that iteratively updates a variational posterior distribution over structure, latent variables, and parameters. Conventional EP makes local distribution updates based on a “context” that captures uncertainty in the remainder of the network. SEP extends this context to include uncertainty in structure, and returns a variational distribution over structures rather than a single network. I will demonstrate the performance of SEP on synthetic problems, as well as on real-world data from a clinical study of asthma and allergies.
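To make the contrast with structural EM concrete, the following is a toy sketch (not the authors' SEP algorithm) of what "a variational distribution over structures rather than a single network" can mean: the posterior over structure is represented by per-edge inclusion probabilities, and each edge's probability is refreshed from a local score computed in the context of the current beliefs about all other edges. The function `local_log_odds` and the `data_strength` values are hypothetical stand-ins for the evidence a real EP-style update would derive from data, latent variables, and parameters.

```python
import math

def local_log_odds(edge, edge_probs, data_strength):
    # Hypothetical local evidence for including `edge`: a per-edge signal,
    # mildly damped by the expected number of other edges already present
    # (a crude "context" summarising beliefs about the rest of the network).
    context = sum(edge_probs.values()) - edge_probs[edge]
    return data_strength[edge] - 0.1 * context

def sep_like_sweep(edges, data_strength, n_sweeps=20):
    """Iteratively update a factorised posterior over edge inclusion."""
    edge_probs = {e: 0.5 for e in edges}  # uninformative initialisation
    for _ in range(n_sweeps):
        for e in edges:
            lo = local_log_odds(e, edge_probs, data_strength)
            edge_probs[e] = 1.0 / (1.0 + math.exp(-lo))  # sigmoid of log-odds
    return edge_probs

# Three candidate edges with synthetic evidence: two supported, one not.
edges = [("A", "B"), ("B", "C"), ("A", "C")]
strength = {("A", "B"): 3.0, ("B", "C"): 2.5, ("A", "C"): -2.0}
probs = sep_like_sweep(edges, strength)
```

The output is a distribution over structures (here, independent edge marginals), in contrast to structural EM's single best network; a greedy search would simply keep or discard each edge.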
This talk is part of the Machine Learning @ CUED series.