BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//Talks.cam//talks.cam.ac.uk//
X-WR-CALNAME:Talks.cam
BEGIN:VEVENT
SUMMARY:PETs\, POTs\, and pitfalls: rethinking the protection of users aga
 inst machine learning - Carmela Troncoso (EPFL)
DTSTART:20200220T150000Z
DTEND:20200220T160000Z
UID:TALK129367@talks.cam.ac.uk
CONTACT:33450
DESCRIPTION:Abstract:\nIn a machine-learning dominated world\, users' digi
 tal interactions are monitored\, and scrutinized in order to enhance servi
 ces. These enhancements\, however\, may not always have the benefit and pr
 eferences of the users as a primary goal. Machine learning\, for instance\
 , can be used to learn users' demographics and interests in order to fuel 
 targeted advertisements\, regardless of people's privacy rights\; or to le
 arn bank customers' behavioral patterns to optimize the monetary benefits
  of loans\, with disregard for discrimination. In other words\, machine le
 arning models may be adversarial in their goals and operation. Therefore\,
  adversarial machine learning techniques that are usually considered undes
 irable can be turned into robust protection mechanisms for users. In this 
 talk we discuss two protective uses of adversarial machine learning\, and 
 challenges for protection arising from the biases implicit in many machine
  learning models.\n \nBio:\nCarmela Troncoso is an Assistant Professor at 
 EPFL where she leads the Security and Privacy Engineering (SPRING) Laborat
 ory. Her research focuses on privacy protection\, with an emphasis on de
 veloping systematic means to build privacy-preserving systems and evalua
 te these systems' information leakage.
LOCATION:Lecture Theatre 2\, Computer Laboratory
END:VEVENT
END:VCALENDAR
