Uncertainty and Learning in Spoken Human-Computer Dialogue
- Speaker: Blaise Thomson, CUED MIL
- Date & Time: Monday 10 March 2008, 13:00 - 14:00
- Venue: LR5, Engineering Department, Baker Building
Abstract
In any spoken dialogue with a computer, errors in both speech recognition and semantic processing cause significant decreases in performance. Recent work has suggested the Partially Observable Markov Decision Process (POMDP) as a method for overcoming these difficulties. The POMDP model is able to capture the uncertainty inherent in dialogue and also provides a mechanism for the system to adapt and learn what to say in which situation. While effective on small problems, the POMDP approach has struggled to scale to real-world dialogues. This talk introduces an approach based on the POMDP model which does scale. Bayesian networks are used to implement efficient belief updates, and special function approximation techniques combined with gradient-based learning provide an effective learning algorithm. Simulations show that the proposed framework outperforms standard techniques as error rates increase.
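The belief update at the heart of the POMDP approach can be sketched as follows. This is a minimal illustration of the generic POMDP update rule b'(s') ∝ O(o | s', a) · Σ_s T(s' | s, a) · b(s), not the talk's Bayesian-network factorisation; the two-state dialogue domain, the probabilities, and all names are illustrative assumptions.

```python
def belief_update(belief, action, observation, transition, observation_model):
    """One POMDP belief update: b'(s') ∝ O(o | s', a) * sum_s T(s' | s, a) * b(s).

    belief:            dict state -> probability
    transition:        dict (state, action) -> dict next_state -> probability
    observation_model: dict (state, action) -> dict observation -> probability
    """
    new_belief = {}
    for s_next in belief:
        # Predict: marginalise the transition model over the current belief.
        predicted = sum(transition[(s, action)].get(s_next, 0.0) * p
                       for s, p in belief.items())
        # Correct: weight by the likelihood of the observation.
        new_belief[s_next] = (
            observation_model[(s_next, action)].get(observation, 0.0) * predicted)
    # Normalise so the updated belief sums to one.
    total = sum(new_belief.values())
    return {s: p / total for s, p in new_belief.items()}


# Toy dialogue domain (hypothetical): the user's goal is "restaurant" or "hotel".
belief = {"restaurant": 0.5, "hotel": 0.5}
transition = {  # user goals persist across turns
    ("restaurant", "ask"): {"restaurant": 1.0},
    ("hotel", "ask"): {"hotel": 1.0},
}
observation_model = {  # noisy speech recogniser, assumed 80% accurate
    ("restaurant", "ask"): {"restaurant": 0.8, "hotel": 0.2},
    ("hotel", "ask"): {"restaurant": 0.2, "hotel": 0.8},
}

# After hearing "restaurant", the belief shifts towards that goal.
updated = belief_update(belief, "ask", "restaurant", transition, observation_model)
```

For realistic dialogue state spaces this flat enumeration is intractable, which is exactly the scaling problem the talk addresses by factorising the state with Bayesian networks.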
Series: This talk is part of the Machine Intelligence Laboratory Speech Seminars series.
Included in Lists
- Cambridge Forum of Science and Humanities
- Cambridge Language Sciences
- Cambridge talks
- Chris Davis' list
- CUED Speech Group Seminars
- Guy Emerson's list
- Information Engineering Division seminar list
- LR5, Engineering Department, Baker Building
- Machine Intelligence Laboratory Speech Seminars
- PhD related
- Trust & Technology Initiative - interesting events
- yk449