BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//talks.cam.ac.uk//v3//EN
BEGIN:VTIMEZONE
TZID:Europe/London
BEGIN:DAYLIGHT
TZOFFSETFROM:+0000
TZOFFSETTO:+0100
TZNAME:BST
DTSTART:19700329T010000
RRULE:FREQ=YEARLY;BYMONTH=3;BYDAY=-1SU
END:DAYLIGHT
BEGIN:STANDARD
TZOFFSETFROM:+0100
TZOFFSETTO:+0000
TZNAME:GMT
DTSTART:19701025T020000
RRULE:FREQ=YEARLY;BYMONTH=10;BYDAY=-1SU
END:STANDARD
END:VTIMEZONE
BEGIN:VEVENT
CATEGORIES:Microsoft Research Cambridge\, public talks
SUMMARY:Deep Gaussian Processes - Neil Lawrence\, University of Sheffield
DTSTART;TZID=Europe/London:20130903T150000
DTEND;TZID=Europe/London:20130903T160000
UID:TALK46963AThttp://talks.cam.ac.uk
URL:http://talks.cam.ac.uk/talk/index/46963
DESCRIPTION:In this talk we will introduce deep Gaussian process (GP) models. Deep GPs are a deep belief network based on Gaussian process mappings. The data is modelled as the output of a multivariate GP. The inputs to that Gaussian process are then governed by another GP. A single layer model is equivalent to a standard GP or the GP latent variable model (GPLVM). We perform inference in the model by approximate variational marginalization. This results in a strict lower bound on the marginal likelihood of the model which we use for model selection (number of layers and nodes per layer). Deep belief networks are typically applied to relatively large data sets using stochastic gradient descent for optimization. Our fully Bayesian treatment allows for the application of deep models even when data is scarce. Model selection by our variational bound shows that a five layer hierarchy is justified even when modelling a digit data set containing only 150 examples. In the seminar we will briefly review dimensionality reduction via Gaussian processes\, before showing how this framework can be extended to build deep models.
LOCATION:Auditorium\, Microsoft Research Ltd\, 21 Station Road\, Cambridge\, CB1 2FB
CONTACT:Microsoft Research Cambridge Talks Admins
END:VEVENT
END:VCALENDAR