BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//Talks.cam//talks.cam.ac.uk//
X-WR-CALNAME:Talks.cam
BEGIN:VEVENT
SUMMARY:Work in progress: Diving deeper into building distributed represen
 tations of graphs - Paul Scherer
DTSTART:20191203T130000Z
DTEND:20191203T140000Z
UID:TALK132430@talks.cam.ac.uk
CONTACT:Mateja Jamnik
DESCRIPTION:A fundamental prerequisite for machine learning algorithms to 
 learn about input data is the ability to discern one observation from ano
 ther. In most cases this requires explicitly transforming observations in
 to feature vector representations that can be used as inputs to machine l
 earning algorithms. The transformation of an observation into another for
 m (and onward into other representations) for input into learning systems
  is a crucial stage that incorporates various assumptions we have made ab
 out the observations and the desired behaviour of the learning system. Th
 is talk will focus on building representations of graphs\, starting from 
 an assumption we make about the data\, interpreting this assumption\, and
  formulating a system for learning distributed representations of graphs 
 with popular neural embedding methods. From there I will dive deeper into
  what the neural embedding method is doing\, characterising the construct
 ion of such embeddings based on the association between a graph and its i
 nduced substructures.
LOCATION:SS03\, Computer Laboratory\, William Gates Building
END:VEVENT
END:VCALENDAR
