Structural Expectation Propagation (SEP): Bayesian structure learning for networks with latent variables
If you have a question about this talk, please contact Zoubin Ghahramani.
Learning the structure of discrete Bayesian networks has been the subject of extensive research in machine learning. One of the few methods that can handle networks with latent variables is the "structural EM algorithm", which interleaves greedy structure search with the estimation of latent variables and parameters, maintaining a single best network at each step. I will describe Structural Expectation Propagation (SEP), a novel method that iteratively updates a variational posterior distribution over structures, latent variables, and parameters. Conventional EP makes local distribution updates based on a 'context' that captures uncertainty in the remainder of the network. SEP extends this context to include uncertainty in structure, and returns a variational distribution over structures rather than a single network. I will demonstrate the performance of SEP on synthetic problems, as well as on real-world data from a clinical study of asthma and allergies.
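To illustrate the idea of a posterior distribution over structures rather than a single best network, here is a toy sketch (not the SEP algorithm itself, which uses variational updates and handles latent variables): for two observed binary variables, it enumerates the three candidate structures and computes an exact Bayesian posterior over them using Beta(1,1) parameter priors. The variable names and data are purely illustrative.

```python
# Toy Bayesian posterior over network structures by exact enumeration,
# for two binary variables X and Y. Candidate structures: X _|_ Y,
# X -> Y, Y -> X. Beta(1,1) priors give closed-form marginal likelihoods.
from math import lgamma, exp

def log_ml_binary(n0, n1):
    # Log marginal likelihood of binary counts under a Beta(1,1) prior:
    # integral of theta^n1 (1-theta)^n0 d(theta) = n0! n1! / (n0+n1+1)!
    return lgamma(n0 + 1) + lgamma(n1 + 1) - lgamma(n0 + n1 + 2)

def score_independent(data):
    x1 = sum(x for x, _ in data); y1 = sum(y for _, y in data)
    n = len(data)
    return log_ml_binary(n - x1, x1) + log_ml_binary(n - y1, y1)

def score_edge(data, parent, child):
    # Marginal likelihood of the parent, times the child's likelihood
    # conditioned on each parent value.
    p1 = sum(d[parent] for d in data)
    total = log_ml_binary(len(data) - p1, p1)
    for v in (0, 1):
        sub = [d for d in data if d[parent] == v]
        c1 = sum(d[child] for d in sub)
        total += log_ml_binary(len(sub) - c1, c1)
    return total

def structure_posterior(data):
    logs = {
        "X _|_ Y": score_independent(data),
        "X -> Y": score_edge(data, 0, 1),
        "Y -> X": score_edge(data, 1, 0),
    }
    m = max(logs.values())
    weights = {k: exp(v - m) for k, v in logs.items()}
    z = sum(weights.values())
    return {k: w / z for k, w in weights.items()}

# Strongly correlated data: the dependent structures dominate, and the
# two Markov-equivalent directions receive identical posterior mass.
data = [(0, 0)] * 10 + [(1, 1)] * 10
post = structure_posterior(data)
```

Note that X → Y and Y → X are Markov equivalent here and receive equal posterior probability; structure posteriors can only distinguish equivalence classes from observational data alone.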
This talk is part of the Machine Learning @ CUED series.