BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//talks.cam.ac.uk//v3//EN
BEGIN:VTIMEZONE
TZID:Europe/London
BEGIN:DAYLIGHT
TZOFFSETFROM:+0000
TZOFFSETTO:+0100
TZNAME:BST
DTSTART:19700329T010000
RRULE:FREQ=YEARLY;BYMONTH=3;BYDAY=-1SU
END:DAYLIGHT
BEGIN:STANDARD
TZOFFSETFROM:+0100
TZOFFSETTO:+0000
TZNAME:GMT
DTSTART:19701025T020000
RRULE:FREQ=YEARLY;BYMONTH=10;BYDAY=-1SU
END:STANDARD
END:VTIMEZONE
BEGIN:VEVENT
CATEGORIES:Machine Learning @ CUED
SUMMARY:Probabilistic Dimensional Reduction with the Gaussian Process
  Latent Variable Model - Dr Neil Lawrence\, Computer Science\,
  University of Sheffield
DTSTART;TZID=Europe/London:20060307T123000
DTEND;TZID=Europe/London:20060307T133000
UID:TALK4822AThttp://talks.cam.ac.uk
URL:http://talks.cam.ac.uk/talk/index/4822
DESCRIPTION:Density modelling in high dimensions is a very difficult
  problem. Traditional approaches\, such as mixtures of Gaussians\,
  typically fail to capture the structure of data sets in high
  dimensional spaces. In this talk we will argue that for many data sets
  of interest\, the data can be represented as a lower dimensional
  manifold immersed in the higher dimensional space. We will then present
  the Gaussian Process Latent Variable Model (GP-LVM)\, a non-linear
  probabilistic variant of principal component analysis (PCA) which
  implicitly assumes that the data lies on a lower dimensional
  space.\n\nHaving introduced the GP-LVM we will review extensions to the
  algorithm\, including dynamics\, learning of large data sets and back
  constraints. We will demonstrate the application of the model and its
  extensions to a range of data sets\, including human motion data\, a
  vowel data set and a robot mapping problem.\n
LOCATION:LR10\, Department of Engineering
CONTACT:Zoubin Ghahramani
END:VEVENT
END:VCALENDAR