Deep (Inter-)Active Learning for NLP: Cure-all or Catastrophe?
- 🎤 Speaker: Zachary Chase Lipton, Carnegie Mellon University
- 📅 Date & Time: Friday 24 January 2020, 15:00 - 16:00
- 📍 Venue: Auditorium, Microsoft Research Ltd, 21 Station Road, Cambridge, CB1 2FB
Abstract
While deep learning produces supervised models with unprecedented predictive performance on many tasks, under typical training procedures, advantages over classical methods emerge only with large datasets. The extreme data-dependence of reinforcement learners may be even more problematic. Millions of experiences sampled from video games come cheaply, but human-interacting systems can't afford to waste so much labor. In this talk, I will discuss several efforts to increase the labor-efficiency of learning from human interactions. Specifically, I will cover work on learning dialogue policies, deep active learning for natural language processing, learning from noisy and singly-labeled data, and active learning with partial feedback. Finally, time permitting, I'll discuss a new approach for reducing the reliance of NLP models on spurious associations in the data that relies on a new mechanism for interacting with annotators.
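To make the active-learning theme concrete: the standard pool-based loop trains a model on a small labeled set, then repeatedly asks an annotator to label the unlabeled example the model is least certain about. The following is a minimal, self-contained sketch of that loop on an invented 1-D toy task (not from the talk, and not the speaker's method); the logistic-regression fit and uncertainty rule are generic illustrations.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy unlabeled pool: 1-D inputs with true label x > 0
# (a stand-in for, e.g., sentence embeddings in an NLP task).
pool_X = rng.uniform(-1, 1, size=200)
pool_y = (pool_X > 0).astype(float)

def fit(X, y, lr=0.5, steps=200):
    """Fit a 1-D logistic regression (w, b) by gradient descent."""
    w, b = 0.0, 0.0
    for _ in range(steps):
        p = 1 / (1 + np.exp(-(w * X + b)))
        g = p - y
        w -= lr * np.mean(g * X)
        b -= lr * np.mean(g)
    return w, b

# Seed with a few random labels, then query the most uncertain point
# (predicted probability closest to 0.5) for ten rounds.
labeled = list(rng.choice(len(pool_X), size=4, replace=False))
for _ in range(10):
    w, b = fit(pool_X[labeled], pool_y[labeled])
    p = 1 / (1 + np.exp(-(w * pool_X + b)))
    uncertainty = -np.abs(p - 0.5)   # higher = closer to the decision boundary
    uncertainty[labeled] = -np.inf   # never re-query an already-labeled point
    labeled.append(int(np.argmax(uncertainty)))

# Final model trained on the 14 actively chosen labels.
w, b = fit(pool_X[labeled], pool_y[labeled])
acc = float(np.mean(((1 / (1 + np.exp(-(w * pool_X + b)))) > 0.5) == pool_y))
```

The point of the sketch is the query rule: instead of labeling all 200 points, the annotator labels 14, most of them near the decision boundary where labels are most informative.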
Series: This talk is part of the Microsoft Research Cambridge public talks series.