Graph Neural Networks through the lens of algebraic topology, differential geometry, and PDEs

If you have a question about this talk, please contact Ben Karniely.

The message-passing paradigm has been the “battle horse” of deep learning on graphs for several years, making graph neural networks a big success in a wide range of applications, from particle physics to protein design. From a theoretical viewpoint, it established the link to the Weisfeiler-Lehman hierarchy, making it possible to analyse the expressive power of GNNs. I argue that the very “graph-centric” mindset of current graph deep learning schemes may hinder future progress in the field. As an alternative, I propose physics-inspired “continuous” learning models that open up a new trove of tools from the fields of differential geometry, algebraic topology, and differential equations, so far largely unexplored in graph ML.
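To make the contrast concrete, below is a minimal illustrative sketch (not taken from the talk) of the two viewpoints the abstract mentions: a discrete message-passing layer versus a “continuous”, physics-inspired update obtained by integrating a diffusion (heat) equation on the graph. The function names, step sizes, and toy graph are assumptions for illustration only.

```python
# Sketch only: contrasting a discrete message-passing step with a continuous
# graph-diffusion step (heat equation on a graph). All names and parameters
# here are illustrative assumptions, not the speaker's implementation.
import numpy as np

def message_passing_step(A, X, W):
    """One discrete message-passing layer: aggregate neighbour features, then transform."""
    deg = A.sum(axis=1, keepdims=True)
    agg = (A @ X) / np.maximum(deg, 1)      # mean aggregation over neighbours
    return np.tanh(agg @ W)                 # learned pointwise update with weights W

def graph_diffusion(A, X, t_end, dt=0.01):
    """Integrate the graph heat equation dX/dt = -L X with explicit Euler steps."""
    L = np.diag(A.sum(axis=1)) - A          # combinatorial graph Laplacian
    for _ in range(int(t_end / dt)):
        X = X - dt * (L @ X)                # one Euler step of the diffusion PDE
    return X

# Toy example: a 4-node path graph with 2-dimensional node features.
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
X = np.random.default_rng(0).normal(size=(4, 2))
W = np.random.default_rng(1).normal(size=(2, 2))

print(message_passing_step(A, X, W))    # one discrete GNN layer
print(graph_diffusion(A, X, t_end=1.0)) # node features after diffusing for time t = 1
```

In the discrete view, depth is the number of message-passing layers; in the continuous view, the analogue of depth is the integration time of the underlying differential equation, which is the kind of reframing the abstract refers to.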

We are asking those attending in person to please take a lateral flow test with a negative result before the talk.

Link to join virtually: https://cl-cam-ac-uk.zoom.us/j/97767639783?pwd=T09GcVJxZUNEUFEvRnZnbWwxeEwzQT09

A recording of this talk is available at the following link: https://www.cl.cam.ac.uk/seminars/wednesday/video/

This talk is part of the Wednesday Seminars - Department of Computer Science and Technology series.

