Structural Expectation Propagation (SEP): Bayesian structure learning for networks with latent variables
If you have a question about this talk, please contact Zoubin Ghahramani.
Learning the structure of discrete Bayesian networks has been the subject of extensive research in machine learning. One of the few methods that can handle networks with latent variables is the “structural EM algorithm”, which interleaves greedy structure search with the estimation of latent variables and parameters, maintaining a single best network at each step. I will describe Structural Expectation Propagation (SEP), a novel method that iteratively updates a variational posterior distribution over structure, latent variables, and parameters. Conventional EP makes local distribution updates based on a ‘context’ that captures uncertainty in the remainder of the network. SEP extends this context to include uncertainty in structure, and returns a variational distribution over structures rather than a single network. I will demonstrate the performance of SEP on synthetic problems, as well as on real-world data from a clinical study of asthma and allergies.
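The key contrast in the abstract is between keeping a single best network (structural EM) and maintaining a distribution over structures (SEP). The following sketch is illustrative only and is not the SEP algorithm: for a toy two-variable binary network with no latent variables, it computes an exact Bayesian posterior over a tiny candidate structure space (under Beta-Bernoulli priors), standing in for the variational distribution over structures that SEP would maintain. All function and structure names are invented for the example.

```python
import numpy as np
from math import lgamma

def log_evidence_bernoulli(n1, n0, a=1.0):
    """Marginal likelihood of n1 ones and n0 zeros under a Beta(a, a) prior."""
    return (lgamma(2 * a) - 2 * lgamma(a)
            + lgamma(a + n1) + lgamma(a + n0) - lgamma(2 * a + n1 + n0))

def structure_posterior(data):
    """Posterior over two candidate structures for binary pairs (x, y),
    under a uniform structure prior: 'X _|_ Y' and 'X -> Y'."""
    x = np.array([d[0] for d in data])
    y = np.array([d[1] for d in data])
    # Structure 1: X and Y modelled as independent Bernoullis.
    log_ml_indep = (log_evidence_bernoulli((x == 1).sum(), (x == 0).sum())
                    + log_evidence_bernoulli((y == 1).sum(), (y == 0).sum()))
    # Structure 2: X -> Y, with a separate Bernoulli for Y per parent state.
    log_ml_dep = log_evidence_bernoulli((x == 1).sum(), (x == 0).sum())
    for parent_state in (0, 1):
        yp = y[x == parent_state]
        log_ml_dep += log_evidence_bernoulli((yp == 1).sum(), (yp == 0).sum())
    logs = np.array([log_ml_indep, log_ml_dep])
    weights = np.exp(logs - logs.max())   # normalise in log space for stability
    weights /= weights.sum()
    return {"X _|_ Y": weights[0], "X -> Y": weights[1]}

# Strongly correlated synthetic data: the posterior mass should favour X -> Y,
# while still quantifying residual uncertainty over structures.
data = [(0, 0)] * 20 + [(1, 1)] * 20 + [(0, 1)] * 2 + [(1, 0)] * 2
posterior = structure_posterior(data)
```

In a realistic setting the structure space is far too large to enumerate, and latent variables make the marginal likelihood intractable — which is exactly where a variational scheme such as SEP, rather than this exhaustive computation, becomes necessary.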
This talk is part of the Machine Learning @ CUED series.