
Polynomial Sheaf Hypergraph Neural Network


If you have a question about this talk, please contact Leo-Minh Kustermann.

Recording: https://youtu.be/k2qGIOLUkfs

Our architecture builds on a corrected formulation of the sheaf hypergraph Laplacian; we systematically investigate the effects of polynomial degree, diffusion-based versus direct filtering formulations, and structural constraints such as stalk dimensionality.
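To make the contrast concrete, here is a minimal NumPy sketch of degree-K polynomial filtering with a sheaf hypergraph Laplacian. The Laplacian L is assumed to be given as a dense matrix and the coefficients as a plain list (in the actual architecture both would be learned within a neural network); the function names are illustrative, not from the talk.

```python
import numpy as np

def polynomial_sheaf_filter(L, X, coeffs):
    """Apply a polynomial filter p(L) X = sum_k c_k L^k X.

    L      : (n, n) sheaf hypergraph Laplacian (assumed given)
    X      : (n, d) stalk-stacked node features
    coeffs : K+1 polynomial coefficients c_0 .. c_K (learned in practice)
    """
    out = np.zeros_like(X)
    power = X.copy()          # starts at L^0 X = X
    for c in coeffs:
        out += c * power      # accumulate c_k * L^k X
        power = L @ power     # advance to the next power of L
    return out

def one_hop_diffusion(L, X):
    """One-hop sheaf diffusion is the special case p(L) = I - L."""
    return polynomial_sheaf_filter(L, X, [1.0, -1.0])
```

The one-hop baseline fixes the filter shape in advance, whereas the polynomial version must learn all K+1 coefficients jointly, which is the source of the optimisation instability discussed below.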

Contrary to our initial hypothesis, experiments on seven benchmark datasets show that polynomial multi-hop filtering does not consistently outperform one-hop sheaf diffusion baselines, particularly on homophilic citation networks. While polynomial filters are theoretically expressive and capable of approximating arbitrary spectral responses, jointly learning all coefficients induces a highly non-convex optimisation landscape, leading to instability, local minima, and higher variance.

These results suggest that maximising expressivity through unconstrained polynomial parameterisation can undermine practical performance. We therefore advocate for structured learning strategies (such as frequency-band decomposition, spatial pooling, or hybrid spectral–spatial designs) that retain expressive power while introducing inductive biases to stabilise optimisation and improve generalisation on hypergraph-structured data.
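As one illustration of such a structured strategy, a frequency-band decomposition replaces free polynomial coefficients with a small number of fixed band responses, each scaled by a single learned weight. The sketch below is an assumption about what such a parameterisation could look like (two bands, a supplied spectral upper bound `lam_max`), not the design presented in the talk.

```python
import numpy as np

def band_filter(L, X, band_weights, lam_max=2.0):
    """Hypothetical two-band parameterisation: combine a fixed low-pass
    response (I - L/lam_max) and a fixed high-pass response (L/lam_max)
    with one learned weight each, instead of learning K+1 free
    polynomial coefficients.

    L            : (n, n) sheaf hypergraph Laplacian
    X            : (n, d) node features
    band_weights : (w_low, w_high), the only learned parameters
    lam_max      : upper bound on the spectrum of L
    """
    high = (L @ X) / lam_max   # fixed high-frequency component
    low = X - high             # fixed low-frequency component
    return band_weights[0] * low + band_weights[1] * high
```

Constraining the filter to a weighted sum of fixed bands turns the highly non-convex joint search over coefficients into a much smaller, better-conditioned problem, at the cost of some spectral expressivity.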

This talk is part of the Polynomial Sheaf Hypergraph Neural Networks series.


© 2006-2025 Talks.cam, University of Cambridge. Contact Us | Help and Documentation | Privacy and Publicity