
Inference for eigenstructure of high-dimensional covariance matrices


If you have a question about this talk, please contact INI IT.

STS - Statistical scalability

Sparse principal component analysis (PCA) has become one of the most widely used techniques for dimensionality reduction in high-dimensional datasets. The main challenge underlying sparse PCA is to estimate the first vector of loadings of the population covariance matrix under the assumption that only a certain number of loadings are non-zero. A vast number of methods have been proposed in the literature for point estimation of the eigenstructure of the covariance matrix. In this work, we study uncertainty quantification and propose methodology for inference and hypothesis testing for individual loadings and for the largest eigenvalue of the covariance matrix. We base our methodology on a Lasso-penalized M-estimator which, despite non-convexity, may be solved by a polynomial-time algorithm such as coordinate or gradient descent. Our results provide theoretical guarantees for the asymptotic normality of the new estimators and may be used for valid hypothesis testing and variable selection.
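For illustration only, a generic Lasso-penalized sparse-PCA objective of the kind mentioned in the abstract can be maximised by a thresholded projected-gradient scheme. The Python sketch below shows this general idea; the function name sparse_pc1, the warm start, the step size and the penalty level lam are assumptions for the example, not details of the estimator or algorithm presented in the talk.

    import numpy as np

    def sparse_pc1(S, lam, n_iter=500):
        # Sketch: maximise v' S v - lam * ||v||_1 over the unit sphere by
        # projected gradient ascent with soft-thresholding (illustrative,
        # not necessarily the talk's estimator or algorithm).
        step = 1.0 / (np.linalg.norm(S, 2) + 1e-12)   # conservative step size
        v = np.linalg.eigh(S)[1][:, -1]               # warm start: dense first PC
        for _ in range(n_iter):
            g = v + step * (S @ v)                                        # gradient step
            g = np.sign(g) * np.maximum(np.abs(g) - step * lam, 0.0)      # soft-threshold
            nrm = np.linalg.norm(g)
            v = g / nrm if nrm > 0 else g             # project back onto the unit sphere
        return v

    # Toy usage on a simulated sample covariance matrix (lam chosen arbitrarily).
    rng = np.random.default_rng(0)
    X = rng.standard_normal((200, 50))
    S = X.T @ X / X.shape[0]
    v_hat = sparse_pc1(S, lam=0.1)

The soft-thresholding step handles the l1 penalty and induces sparsity in the estimated loading vector, while the projection keeps the iterate on the unit sphere; this is one common polynomial-time approach to such non-convex penalized eigenvector problems.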



This talk is part of the Isaac Newton Institute Seminar Series.
