
Advanced Gaussian Process approximation methods


If you have a question about this talk, please contact mv310.

Inference and learning in regression or classification tasks using Gaussian processes is expensive ($O(n^3)$ in the number of datapoints $n$), and practitioners often need to resort to approximation techniques. There is now a large family of so-called sparse approximation methods which summarise the observed data using a small number of pseudo-datapoints. These sparse methods can be categorised into two non-exclusive classes: indirect posterior approximations, which employ a modified prior designed to approximately match the original model (e.g. FITC, PITC, sparse spectrum GPs), and direct posterior approximations, which explicitly optimise an approximation to the posterior. In this talk we will discuss two sparse methods in the latter class. The first approximates the posterior using the variational free energy approach (Titsias, 2009) and the second uses expectation propagation (Qi et al., 2010).
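To make the variational free energy idea concrete, the sketch below computes the collapsed lower bound of Titsias (2009) for GP regression with a squared-exponential kernel: $F = \log \mathcal{N}(y \mid 0, Q_{nn} + \sigma^2 I) - \mathrm{tr}(K_{nn} - Q_{nn})/(2\sigma^2)$, with $Q_{nn} = K_{nm} K_{mm}^{-1} K_{mn}$. This is a minimal illustration, not the speaker's code; the kernel hyperparameters and the toy data are assumptions, and for clarity it forms the full $n \times n$ matrix $Q_{nn}$ rather than exploiting the $O(nm^2)$ structure that makes the method sparse in practice.

```python
import numpy as np

def rbf(A, B, lengthscale=1.0, variance=1.0):
    # Squared-exponential kernel matrix between the rows of A and B.
    d2 = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2.0 * A @ B.T
    return variance * np.exp(-0.5 * d2 / lengthscale**2)

def titsias_bound(X, y, Z, noise=0.1):
    """Collapsed variational lower bound of Titsias (2009):
    F = log N(y | 0, Qnn + noise*I) - tr(Knn - Qnn) / (2*noise),
    where Qnn = Knm Kmm^{-1} Kmn and Z are the m inducing inputs.
    (Toy version: builds Qnn densely; a real implementation works
    with Knm and Kmm only, at O(n m^2) cost instead of O(n^3).)"""
    n, m = X.shape[0], Z.shape[0]
    Kmm = rbf(Z, Z) + 1e-8 * np.eye(m)      # jitter for stability
    Knm = rbf(X, Z)
    Qnn = Knm @ np.linalg.solve(Kmm, Knm.T)  # Nystrom approximation of Knn
    S = Qnn + noise * np.eye(n)
    L = np.linalg.cholesky(S)
    alpha = np.linalg.solve(L, y)            # so alpha @ alpha = y' S^{-1} y
    log_marg = (-0.5 * (alpha @ alpha)
                - np.sum(np.log(np.diag(L)))
                - 0.5 * n * np.log(2.0 * np.pi))
    trace_term = (np.trace(rbf(X, X)) - np.trace(Qnn)) / (2.0 * noise)
    return log_marg - trace_term
```

Two properties worth checking numerically: when the inducing inputs coincide with the training inputs ($Z = X$) the bound recovers the exact log marginal likelihood, and for any other $Z$ it lies below it, which is what makes it safe to optimise the inducing-point locations by maximising $F$.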

This talk is part of the Machine Learning Reading Group @ CUED series.

