
Dual-to-kernel learning with ideals


If you have a question about this talk, please contact Prof. Ramji Venkataramanan.

We propose a theory unifying kernel learning and symbolic algebraic methods. Kernel methods are a very popular class of algorithms that employ kernel functions to capture properties of the data efficiently, representing them implicitly in the so-called feature space; the most prominent example is the kernel support vector machine. The main advantage of kernels is also their main downside: because the representation is implicit, it has remained an open question exactly which structures and features make these algorithms work.
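
As a minimal illustration of the kernel trick referred to above (not from the talk), the sketch below shows that a degree-2 polynomial kernel evaluates an inner product in an explicit monomial feature space without ever constructing that space; the kernel name and example values are illustrative only.

```python
# Illustrative sketch: the degree-2 polynomial kernel k(x, z) = (x . z)^2
# equals an inner product in an explicit space of monomials, but is
# evaluated without materialising that feature space.
import numpy as np

def poly2_kernel(x, z):
    """Implicit evaluation: O(d) work, feature space never built."""
    return np.dot(x, z) ** 2

def poly2_features(x):
    """Explicit degree-2 monomial feature map phi(x) = vec(x x^T)."""
    return np.outer(x, x).ravel()

x = np.array([1.0, 2.0, -1.0])
z = np.array([0.5, -1.0, 3.0])

# Both routes give the same number, which is what lets kernel methods
# work in the feature space while representing it only implicitly.
print(poly2_kernel(x, z))                            # 20.25
print(np.dot(poly2_features(x), poly2_features(z)))  # 20.25
```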

Symbolic algebraic methods, on the other hand, are by construction structural and deal with the manipulation of explicit equations. So far, their theoretical complexity and intractable computational cost, for example in Gröbner basis computations, have prevented broad application to real-world learning and data analysis.
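
For readers unfamiliar with the symbolic side, the short sketch below (independent of the talk) uses SymPy's `groebner` to compute a Gröbner basis for the ideal generated by two explicit polynomial equations; the rapid growth of this computation with the number of variables and the degree is the scalability issue mentioned above.

```python
# Compute a Groebner basis for the ideal of the unit circle
# intersected with the line x = y, in lexicographic order.
from sympy import groebner, symbols

x, y = symbols('x y')
G = groebner([x**2 + y**2 - 1, x - y], x, y, order='lex')
print(G)   # basis: [x - y, 2*y**2 - 1]
```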

We show that kernel learning and symbolic algebra are inherently dual to each other, and we use this duality to combine the structure-awareness of algebraic methods with the efficiency and generality of kernels. The main idea is to relate polynomial rings to the feature space and ideals to manifolds, and then to exploit this generative-discriminative duality on kernel matrices. We illustrate this by proposing two algorithms, IPCA and AVICA, for simultaneous manifold and feature learning.
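
The sketch below conveys the general ideals-to-manifolds idea only, and is not the IPCA or AVICA algorithm of the talk: evaluating a monomial feature map on sample points and reading approximately vanishing polynomials off the small singular values of the evaluation matrix recovers an (approximate) ideal describing the data manifold; the degree bound and threshold are illustrative assumptions.

```python
# Illustrative sketch: recover a polynomial that vanishes on the data
# manifold (here, the unit circle) from the near-zero singular values
# of a monomial evaluation matrix. The same spectral information is
# available from the corresponding kernel matrix.
import numpy as np

rng = np.random.default_rng(0)
t = rng.uniform(0, 2 * np.pi, 200)
X = np.column_stack([np.cos(t), np.sin(t)])      # samples on the unit circle

def monomials_deg2(X):
    """Feature map [1, x, y, x^2, x*y, y^2] evaluated at each sample."""
    x, y = X[:, 0], X[:, 1]
    return np.column_stack([np.ones_like(x), x, y, x**2, x * y, y**2])

Phi = monomials_deg2(X)
_, s, Vt = np.linalg.svd(Phi, full_matrices=False)

# Right singular vectors with near-zero singular value are coefficient
# vectors of polynomials that approximately vanish on the data.
vanishing = Vt[s < 1e-8]
print(np.round(vanishing, 3))
# One row, proportional (up to sign) to [-1, 0, 0, 1, 0, 1]:
# it recovers the defining equation x^2 + y^2 - 1 = 0.
```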

This talk is part of the Signal Processing and Communications Lab Seminars series.
