BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//talks.cam.ac.uk//v3//EN
BEGIN:VTIMEZONE
TZID:Europe/London
BEGIN:DAYLIGHT
TZOFFSETFROM:+0000
TZOFFSETTO:+0100
TZNAME:BST
DTSTART:19700329T010000
RRULE:FREQ=YEARLY;BYMONTH=3;BYDAY=-1SU
END:DAYLIGHT
BEGIN:STANDARD
TZOFFSETFROM:+0100
TZOFFSETTO:+0000
TZNAME:GMT
DTSTART:19701025T020000
RRULE:FREQ=YEARLY;BYMONTH=10;BYDAY=-1SU
END:STANDARD
END:VTIMEZONE
BEGIN:VEVENT
CATEGORIES:Machine Learning @ CUED
SUMMARY:Fast Gaussian process learning for regression\, se
mi-supervised classification\, and multiway analys
is - Prof Alan Qi (Purdue U)
DTSTART;TZID=Europe/London:20120704T140000
DTEND;TZID=Europe/London:20120704T150000
UID:TALK38815AThttp://talks.cam.ac.uk
URL:http://talks.cam.ac.uk/talk/index/38815
DESCRIPTION:In this talk I will cover two topics on Gaussian p
rocess learning. First\,\n I will describe te
nsor-variate latent nonparametric Bayesian\n
models\, coupled with efficient inference methods\
, for multiway\n data analysis. Unlike classi
cal tensor decomposition models\, our\n new a
pproaches model nonlinear interactions and handle
both\n continuous and binary data. To efficie
ntly learn the InfTucker\n from data\, we dev
elop a variational inference technique on\n t
ensors. Compared with the classical implementation\, t
he new technique reduces both time and space comp
lexities by several orders of magnitude. Our expe
rimental results on chemometrics and social\n
network datasets demonstrate that our new models
can achieve\n significantly higher prediction
accuracy than state-of-the-art tensor
ition approaches. Furthermore\, for two-dimensiona
l\n problems\, our tensor model reduces to no
nlinear stochastic\n blockmodels for network
modeling\, which I will briefly discuss in\n
the talk as well. Second\, I will describe a new s
parse Gaussian\n process model\, EigenGP\, ba
sed on Karhunen-Loeve (KL) expansions of\n a
GP prior. We can view this new approach as sparse
PCA in a\n functional space\, which not only
reduces the computational cost of\n GP infere
nce but also has the potential of further improvin
g the\n predictive performance of a full GP.
By selecting eigenfunctions\n of Gaussian ker
nels that are associated with data clusters\,\n
EigenGP is also suitable for semi-supervised le
arning. Our\n experimental results demonstrat
e improved predictive performance\n of EigenG
P over several state-of-the-art sparse GP and
semi-supervised learning methods for regression
\, classification\,\n and semi-supervised clas
sification.
LOCATION:Engineering Department\, CBL Room BE-438
CONTACT:Zoubin Ghahramani
END:VEVENT
END:VCALENDAR