
Unbiased Estimation of the Eigenvalues of Large Implicit Matrices


If you have a question about this talk, please contact Pat Wilson.

Many important problems are characterized by the eigenvalues of a large matrix. For example, the difficulty of many optimization problems, such as those arising from fitting large models in statistics and machine learning, can be investigated via the spectrum of the Hessian of the empirical loss function. Network data can be understood via the eigenstructure of the graph Laplacian through spectral graph theory. Quantum simulations and other many-body problems are often characterized via the eigenvalues of the solution space, as are various dynamical systems. However, naive eigenvalue estimation is computationally expensive even when the matrix can be represented explicitly; in many of these situations the matrix is so large that it is only available implicitly via products with vectors. Even worse, one may only have noisy estimates of such matrix-vector products. In this talk I will discuss how several different randomized techniques can be combined into a single procedure for unbiased estimates of the spectral density of large implicit matrices in the presence of noise.
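The abstract does not spell out the procedure, but a standard building block for estimating spectral densities from matrix-vector products alone is to combine Hutchinson's stochastic trace estimator with a Chebyshev polynomial recurrence: the traces tr(T_k(A)) are the Chebyshev moments of the spectral density, and each can be estimated unbiasedly using only matvecs. The sketch below illustrates that idea under the assumption that the implicit symmetric matrix has been rescaled so its eigenvalues lie in [-1, 1]; the function names are hypothetical and this is not the speaker's exact method.

```python
import numpy as np

def chebyshev_trace_moments(matvec, dim, order, num_probes, rng=None):
    """Unbiasedly estimate tr(T_k(A)) for k = 0..order.

    `matvec` applies an implicit symmetric matrix A (assumed rescaled so
    its eigenvalues lie in [-1, 1]); only matrix-vector products are used.
    Hutchinson's estimator: E[z^T M z] = tr(M) for Rademacher probes z.
    """
    rng = np.random.default_rng() if rng is None else rng
    moments = np.zeros(order + 1)
    for _ in range(num_probes):
        z = rng.choice([-1.0, 1.0], size=dim)   # Rademacher probe vector
        t_prev, t_curr = z, matvec(z)           # T_0(A) z and T_1(A) z
        moments[0] += z @ t_prev
        if order >= 1:
            moments[1] += z @ t_curr
        for k in range(2, order + 1):
            # Chebyshev three-term recurrence:
            # T_k(A) z = 2 A T_{k-1}(A) z - T_{k-2}(A) z
            t_prev, t_curr = t_curr, 2.0 * matvec(t_curr) - t_prev
            moments[k] += z @ t_curr
    return moments / num_probes
```

Because the estimator only touches A through `matvec`, it applies equally when A is a Hessian accessed via Hessian-vector products; handling *noisy* matvecs without bias is the harder problem the talk addresses.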

This talk is part of the Machine Learning @ CUED series.

