BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//Talks.cam//talks.cam.ac.uk//
X-WR-CALNAME:Talks.cam
BEGIN:VEVENT
SUMMARY:Graphons and Machine Learning: Modeling and Estimation of Sparse M
 assive Networks - Part II - Christian Borgs (Microsoft (UK))
DTSTART:20161212T143000Z
DTEND:20161212T153000Z
UID:TALK69452@talks.cam.ac.uk
CONTACT:INI IT
DESCRIPTION:There are numerous examples of sparse massive networks\, in pa
 rticular the Internet\, WWW and online social networks. How do we model a
 nd learn these networks? In contrast to conventional learning problems\, w
 here we have many independent samples\, it is often the case for these ne
 tworks that we can get only one independent sample. How do we use a singl
 e snapshot today to learn a model for the network\, and therefore be able
  to predict a similar\, but larger network in the future? In the case of r
 elatively small or moderately sized networks\, it's appropriate to model t
 he network parametrically\, and attempt to learn these parameters. For ma
 ssive networks\, a non-parametric representation is more appropriate. In t
 his talk\, we first review the theory of graphons\, developed over the la
 st decade to describe limits of dense graphs\, and the more recent theory
  describing sparse graphs of unbounded average degree\, including power-l
 aw graphs. We then show how to use these graphons as non-parametric model
 s for sparse networks. Finally\, we show how to get consistent estimators
  of these non-parametric models\, and moreover how to do this in a way th
 at protects the privacy of individuals on the network.\n\nPart I of this t
 alk reviews the theory of graph convergence for dense and sparse graphs. P
 art II uses the results of Part I to model and estimate sparse massive ne
 tworks.
LOCATION:Seminar Room 1\, Newton Institute
END:VEVENT
END:VCALENDAR
