BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//Talks.cam//talks.cam.ac.uk//
X-WR-CALNAME:Talks.cam
BEGIN:VEVENT
SUMMARY:Polynomial Sheaf Hypergraph Neural Network - Leo-Minh Kustermann (
 University of Cambridge)
DTSTART:20260217T180000Z
DTEND:20260217T190000Z
UID:TALK244870@talks.cam.ac.uk
CONTACT:Leo-Minh Kustermann
DESCRIPTION:Recording: https://youtu.be/k2qGIOLUkfs\n\nOur architecture is
  based on a corrected formulation of the sheaf hypergraph Laplacian and sy
 stematically investigates the effects of polynomial degree\, diffusion-bas
 ed versus direct filtering formulations\, and structural constraints such 
 as stalk dimensionality.\n\nContrary to our initial hypothesis\, experimen
 ts on seven benchmark datasets show that polynomial multi-hop filtering do
 es not consistently outperform one-hop sheaf diffusion baselines\, particu
 larly on homophilic citation networks. While polynomial filters are theore
 tically expressive and capable of approximating arbitrary spectral respons
 es\, jointly learning all coefficients induces a highly non-convex optimis
 ation landscape\, leading to instability\, local minima\, and higher varia
 nce.\n\nThese results suggest that maximising expressivity through unconst
 rained polynomial parameterisation can undermine practical performance. We
  therefore advocate for structured learning strategies (frequency-band dec
 omposition\, spatial pooling\, or hybrid spectral–spatial designs) that re
 tain expressive power while introducing inductive biases to stabilise opti
 misation and improve generalisation on hypergraph-structured data.
LOCATION:Computer Laboratory\, William Gates Building\, FW11 and Remote
END:VEVENT
END:VCALENDAR
