BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//talks.cam.ac.uk//v3//EN
BEGIN:VTIMEZONE
TZID:Europe/London
BEGIN:DAYLIGHT
TZOFFSETFROM:+0000
TZOFFSETTO:+0100
TZNAME:BST
DTSTART:19700329T010000
RRULE:FREQ=YEARLY;BYMONTH=3;BYDAY=-1SU
END:DAYLIGHT
BEGIN:STANDARD
TZOFFSETFROM:+0100
TZOFFSETTO:+0000
TZNAME:GMT
DTSTART:19701025T020000
RRULE:FREQ=YEARLY;BYMONTH=10;BYDAY=-1SU
END:STANDARD
END:VTIMEZONE
BEGIN:VEVENT
CATEGORIES:Inference Group
SUMMARY:Bayesian nonparametric latent feature models - Zoubin Ghahramani
DTSTART;TZID=Europe/London:20070502T140000
DTEND;TZID=Europe/London:20070502T150000
UID:TALK6749@talks.cam.ac.uk
URL:http://talks.cam.ac.uk/talk/index/6749
DESCRIPTION:Latent variables are an important component of man
y statistical models. Most latent variable models\
, such as mixture models\, factor analysis\, and i
ndependent components analysis (ICA)\, assume a fi
nite\, usually small number of latent variables.
However\, it may be statistically undesirable to c
onstrain the number of latent variables a priori.
Here we show how a more flexible nonparametric app
roach is possible in which the number of latent va
riables is unbounded. To do this\, we describe a
probability distribution over equivalence classes
of binary matrices with a finite number of rows\,
corresponding to the data points\, and an unbounde
d number of columns\, corresponding to the latent
variables. Each data point can be associated with
a subset of the possible latent variables\, which
we refer to as the latent features of that data p
oint. The binary variables in the matrix indicate
which latent feature is possessed by which data po
int\, and there is a potentially infinite array o
f features. We derive the distribution over unbou
nded binary matrices by taking the limit of a dist
ribution over $N \\times K$ binary matrices as $K
\\rightarrow \\infty$\, a strategy inspired by the
derivation of the Chinese restaurant process (Al
dous\, 1985\; Pitman\, 2002) which preserves excha
ngeability of the rows. We define a simple generat
ive process for this distribution\, which we call
the Indian buffet process (IBP\; Griffiths and Gha
hramani\, 2005). We describe recent extensions of
this model\, Markov chain Monte Carlo algorithms
for inference\, and a number of applications to c
ollaborative filtering\, independent components an
alysis\, bioinformatics\, cognitive modelling\, an
d causal discovery.\n\nJoint work with Thomas L. G
riffiths (UC Berkeley).
LOCATION:TCM Seminar Room\, Cavendish Laboratory\, Departme
nt of Physics
CONTACT:Christian Steinruecken
END:VEVENT
END:VCALENDAR