The Structure-Adaptive Acceleration of Stochastic Proximal Gradient Algorithms
- Speaker: Junqi Tang, University of Edinburgh
- Date & Time: Tuesday 17 September 2019, 14:00-15:00
- Venue: MR 14, Centre for Mathematical Sciences
Abstract
Stochastic gradient methods have become the de facto techniques in data science, signal processing and machine learning, owing to their computational efficiency on large-scale optimization problems. Over the past few years, accelerated stochastic gradient algorithms have been extensively studied and developed; they are not only excellent numerically but also worst-case optimal theoretically for smooth convex objective functions. In many real-world applications, we consider composite optimization tasks in which non-smooth regularizers are used for better estimation or generalization. Such regularizers typically enforce low-dimensional structure in the solution, such as sparsity, group sparsity, low rank and piecewise smoothness. In this talk, we present structure-adaptive variants of randomized optimization algorithms, including accelerated variance-reduced SGD and accelerated proximal coordinate descent, for solving large-scale composite optimization problems more efficiently. These algorithms are tailored to exploit the low-dimensional structure of the solution via judiciously designed restart schemes, based on the restricted strong convexity that the objective function enjoys due to the non-smooth regularization. Our convergence analysis shows that this approach leads to provably improved iteration complexity, and we validate the efficiency of the algorithms numerically on large-scale sparse regression tasks.
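To make the restart idea concrete, here is a minimal sketch (not the speaker's actual algorithms, which use variance-reduced SGD and coordinate descent): an accelerated proximal gradient method (FISTA) with a simple function-value restart, applied to a small lasso problem min_x 0.5·||Ax - b||² + λ·||x||₁. The function names and the restart criterion are our illustrative choices; the structure-adaptive schemes in the talk are more refined.

```python
# Illustrative sketch only: FISTA with a function-value restart on a lasso
# problem. Restarting momentum when progress stalls is the generic idea that
# the talk's structure-adaptive restart schemes build on.
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t * ||.||_1 (elementwise soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def fista_restart(A, b, lam, n_iter=500):
    """Accelerated proximal gradient for 0.5*||Ax-b||^2 + lam*||x||_1."""
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the gradient
    x = z = np.zeros(A.shape[1])
    t, f_prev = 1.0, np.inf
    for _ in range(n_iter):
        grad = A.T @ (A @ z - b)
        x_new = soft_threshold(z - grad / L, lam / L)
        f = 0.5 * np.sum((A @ x_new - b) ** 2) + lam * np.sum(np.abs(x_new))
        if f > f_prev:
            # Objective increased: restart the momentum from the last
            # accepted iterate (plain proximal gradient step next).
            t, z = 1.0, x
            continue
        t_new = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
        z = x_new + ((t - 1.0) / t_new) * (x_new - x)   # momentum extrapolation
        x, t, f_prev = x_new, t_new, f
    return x
```

On sparse regression instances, the non-smooth ℓ₁ term induces the restricted strong convexity that makes such restarts provably effective, which is the phenomenon the talk's analysis quantifies.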
Series
This talk is part of the Cambridge Image Analysis Seminars series.