BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//talks.cam.ac.uk//v3//EN
BEGIN:VTIMEZONE
TZID:Europe/London
BEGIN:DAYLIGHT
TZOFFSETFROM:+0000
TZOFFSETTO:+0100
TZNAME:BST
DTSTART:19700329T010000
RRULE:FREQ=YEARLY;BYMONTH=3;BYDAY=-1SU
END:DAYLIGHT
BEGIN:STANDARD
TZOFFSETFROM:+0100
TZOFFSETTO:+0000
TZNAME:GMT
DTSTART:19701025T020000
RRULE:FREQ=YEARLY;BYMONTH=10;BYDAY=-1SU
END:STANDARD
END:VTIMEZONE
BEGIN:VEVENT
CATEGORIES:Speech Seminars
SUMMARY:Lagrangian relaxation for inference in natural language processing - Michael Collins\, Columbia University
DTSTART;TZID=Europe/London:20110613T130000
DTEND;TZID=Europe/London:20110613T143000
UID:TALK31687AThttp://talks.cam.ac.uk
URL:http://talks.cam.ac.uk/talk/index/31687
DESCRIPTION:There has been a long history in combinatorial optimization of methods that exploit structure in complex problems\, using methods such as dual decomposition or Lagrangian relaxation. These methods leverage the observation that complex inference problems can often be decomposed into efficiently solvable sub-problems. Thus far\, however\, these methods are not widely used in NLP.\n\nIn this talk I'll describe recent work on inference algorithms for NLP based on Lagrangian relaxation. In the first part of the talk I'll describe work on non-projective parsing. In the second part of the talk I'll describe an exact decoding algorithm for syntax-based statistical translation. If time permits\, I'll also briefly describe algorithms for dynamic programming intersections (e.g.\, the intersection of a PCFG and an HMM)\, and for phrase-based translation.\n\nFor all of the problems that we consider\, the resulting algorithms produce exact solutions\, with certificates of optimality\, on the vast majority of examples\; the algorithms are efficient for problems that are either NP-hard (as is the case for non-projective parsing\, or for phrase-based translation)\, or for problems that are solvable in polynomial time using dynamic programming\, but where the traditional exact algorithms are far too expensive to be practical.\n\nWhile the focus of this talk is on NLP problems\, there are close connections to inference methods\, in particular belief propagation\, for graphical models. Our work was inspired by recent work that has used dual decomposition as an alternative to belief propagation in Markov random fields.\n\nThis is joint work with Yin-Wen Chang\, Tommi Jaakkola\, Terry Koo\, Sasha Rush\, and David Sontag.\n
LOCATION:Cambridge University Engineering Department\, Lecture Room 6
CONTACT:Bill Byrne
END:VEVENT
END:VCALENDAR