
Logistic Regression with a Laplacian prior on the Eigenvalues: Convex duality and application to EEG classification


If you have a question about this talk, please contact Zoubin Ghahramani.

Note different room and time

We propose a matrix-coefficient logistic regression for the classification of single-trial electroencephalography (EEG) signals. The method works in the feature space of all variances and covariances between electrodes. The problem is formulated as a single convex optimization with spectral $\ell_1$-regularization of the coefficient matrix. In addition, we propose an efficient optimization algorithm based on a simple interior-point method, in which convex duality plays the key role. Classification results on 162 Brain-Computer Interface (BCI) datasets show significant improvement in classification accuracy over $\ell_2$-regularized logistic regression, rank-2 approximated logistic regression, and the Common Spatial Pattern (CSP) based classifier, a popular technique in BCI. Connections to LASSO, GP classification with a second-order polynomial kernel, and SVM are discussed.
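To make the setup concrete, here is a minimal sketch of a matrix-coefficient logistic regression with a spectral $\ell_1$ (nuclear-norm) penalty. The abstract's own algorithm is an interior-point method built on convex duality; this sketch instead uses a simpler proximal-gradient (ISTA) scheme with singular-value soft-thresholding, and all names, hyperparameters, and data shapes are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def svt(W, tau):
    """Singular-value soft-thresholding: the prox operator of the nuclear norm."""
    U, s, Vt = np.linalg.svd(W, full_matrices=False)
    return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

def fit(Xs, ys, lam=0.1, lr=0.05, n_iter=1000):
    """Proximal-gradient sketch (not the paper's interior-point method).

    Xs: (n, d, d) array of per-trial covariance features.
    ys: (n,) labels in {-1, +1}.
    lam: weight of the spectral l1 (nuclear-norm) penalty on W.
    """
    n, d, _ = Xs.shape
    W = np.zeros((d, d))
    b = 0.0
    for _ in range(n_iter):
        # Margin y_i * (<W, X_i> + b) for each trial.
        margins = ys * (np.einsum('ij,nij->n', W, Xs) + b)
        # Gradient of the logistic loss, computed stably:
        # d/dm log(1 + exp(-m)) = -1 / (1 + exp(m)).
        g = -ys * np.exp(-np.logaddexp(0.0, margins))
        gW = np.einsum('n,nij->ij', g, Xs) / n
        gb = g.mean()
        # Gradient step on the smooth loss, then prox of the penalty:
        # soft-threshold the singular values of W (spectral l1).
        W = svt(W - lr * gW, lr * lam)
        b -= lr * gb
    return W, b
```

The soft-thresholding step drives small singular values of the coefficient matrix exactly to zero, so the learned `W` tends toward low rank, which is what motivates the spectral $\ell_1$ penalty over a plain elementwise one.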

This talk is part of the Machine Learning @ CUED series.

