
Model selection in a large compositional space


If you have a question about this talk, please contact Konstantina Palla.

Note: the first 45 minutes will be planning; the talk will start at around 3:15pm.

We often build complex probabilistic models by “composing” simpler models—using one model to generate the latent variables for another model. This allows us to express complex distributions over the observed data and to share statistical structure between different parts of a model. I’ll present a space of matrix decomposition models defined by the composition of a small number of motifs of probabilistic modeling, such as clustering, low-rank factorizations, and binary latent factor models. This compositional structure can be represented by a context-free grammar whose production rules correspond to these motifs. By exploiting the structure of this grammar, we can generically and efficiently infer latent components and estimate predictive likelihood for nearly 2500 model structures using a small toolbox of reusable algorithms. Using a greedy search over this grammar, we automatically choose the decomposition structure from raw data by evaluating only a small fraction of all models. The proposed method typically finds the correct structure for synthetic data and backs off gracefully to simpler models under heavy noise. It learns sensible structures for datasets as diverse as image patches, motion capture, 20 Questions, and U.S. Senate votes, all using exactly the same code. I’ll briefly describe my ongoing work on estimating marginal likelihood in this space of models and how I think this work relates to compositional models more generally.
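To make the grammar idea concrete, here is a minimal sketch (not the speaker's actual code) of how a context-free grammar over matrix-decomposition structures can be expanded and searched greedily. The symbol names (`G` for a Gaussian matrix, `M` for multinomial cluster assignments, `B` for binary latent factors), the three production rules, and the `score` function are illustrative assumptions, not the talk's definitions; in practice the score would come from fitting each candidate model and estimating its predictive likelihood.

```python
# Illustrative grammar over matrix-decomposition structures.
# Assumed symbols: G = Gaussian matrix, M = multinomial (cluster
# assignments), B = binary latent factors. Each production rewrites
# one G into a composite structure plus Gaussian noise.
PRODUCTIONS = {
    "G": [
        "GG+G",  # low-rank factorization
        "MG+G",  # clustering of rows
        "BG+G",  # binary latent factor model
    ]
}

def expand_once(structure):
    """Return every structure reachable by rewriting a single 'G'."""
    results = []
    for i, sym in enumerate(structure):
        if sym == "G":
            for rhs in PRODUCTIONS["G"]:
                results.append(structure[:i] + "(" + rhs + ")" + structure[i + 1:])
    return results

def greedy_search(score, start="G", depth=2):
    """Greedy search over the grammar: at each level, expand the
    current best structure and keep the highest-scoring child,
    stopping when no child improves on the current score."""
    best, best_score = start, score(start)
    for _ in range(depth):
        children = expand_once(best)
        if not children:
            break
        candidate = max(children, key=score)
        if score(candidate) <= best_score:
            break
        best, best_score = candidate, score(candidate)
    return best
```

Because each level only expands the single best structure found so far, the search evaluates a handful of candidates per level rather than all of the roughly 2500 structures in the space — which is the point of the greedy strategy described above.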

This talk is part of the Machine Learning Reading Group @ CUED series.


