Sparsity pattern aggregation for convex stochastic optimization.

If you have a question about this talk, please contact Richard Nickl.

Important statistical problems, including regression, binary classification and density estimation, can be recast as convex stochastic optimization problems when viewed from the point of view of statistical aggregation. These convex problems can be solved numerically and efficiently in high dimension, but may show mediocre statistical performance. One way to overcome this limitation is to assume that there exist approximate solutions, called “sparse”, that are of moderate dimension. This presentation introduces a new method called “exponential screening” (ES) as an alternative to $\ell_1$-penalization, which is currently the most popular way to find such sparse solutions. While $\ell_1$-based methods can be analyzed only under rather stringent assumptions, ES achieves optimal statistical performance under fairly general assumptions. Its exact implementation is not straightforward, but it can be approximated using the Metropolis algorithm; the resulting stochastic greedy algorithm performs surprisingly well on a simulated sparse recovery problem.
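The Metropolis approximation mentioned above can be sketched roughly as follows. This is not the speaker's implementation: the toy data, the penalized least-squares criterion, the penalty constant and the temperature are all illustrative assumptions. Only the overall idea — a random walk over sparsity patterns accepted or rejected by a Metropolis rule — follows the abstract.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy sparse-regression problem: y = X beta + noise,
# with only the first s of p coefficients nonzero.
n, p, s = 50, 20, 3
X = rng.standard_normal((n, p))
beta_true = np.zeros(p)
beta_true[:s] = 3.0
y = X @ beta_true + 0.5 * rng.standard_normal(n)

def score(pattern):
    """Penalized least-squares fit restricted to the support in `pattern`.
    The log(p)-scaled penalty on the support size mimics the preference
    for sparse patterns; the constant 2.0 is an illustrative choice."""
    idx = np.flatnonzero(pattern)
    if idx.size == 0:
        resid = y
    else:
        coef, *_ = np.linalg.lstsq(X[:, idx], y, rcond=None)
        resid = y - X[:, idx] @ coef
    return resid @ resid + 2.0 * idx.size * np.log(p)

def metropolis_search(n_iter=2000, temperature=1.0):
    """Stochastic greedy search over sparsity patterns via Metropolis moves."""
    pattern = np.zeros(p, dtype=bool)
    current = score(pattern)
    best, best_pattern = current, pattern.copy()
    for _ in range(n_iter):
        j = rng.integers(p)              # propose flipping one coordinate
        proposal = pattern.copy()
        proposal[j] = ~proposal[j]
        new = score(proposal)
        # Metropolis acceptance: always accept an improvement, accept a
        # worse pattern with probability exp(-(new - current)/temperature).
        if new < current or rng.random() < np.exp((current - new) / temperature):
            pattern, current = proposal, new
            if current < best:
                best, best_pattern = current, pattern.copy()
    return best_pattern

support = np.flatnonzero(metropolis_search())
print("estimated support:", support)
```

With a strong signal such as the one simulated here, the best pattern visited by the chain typically coincides with the true support, while the occasional uphill moves let the search escape locally attractive but suboptimal patterns.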

This talk is part of the Statistics series.

