Generalized Conditional Gradient with Augmented Lagrangian for Composite Minimization - Exact and Inexact Perspectives

If you have a question about this talk, please contact Jingwei Liang.

We propose a splitting scheme, which we call the CGALP algorithm, that hybridizes the generalized conditional gradient method with a proximal step for minimizing the sum of three proper, convex, lower-semicontinuous functions in real Hilbert spaces. The minimization is subject to an affine constraint, which in particular allows composite problems (sums of more than three functions) to be handled separably via the usual product-space technique. While classical conditional gradient methods require Lipschitz continuity of the gradient of the differentiable part of the objective, CGALP needs only differentiability (on an appropriate subset), hence circumventing the intricate question of Lipschitz continuity of gradients. For the two remaining functions in the objective, we do not require any additional regularity assumptions. The second function, possibly nonsmooth, is assumed to be simple, i.e., its associated proximal mapping is easily computable. For the third function, again nonsmooth, we only assume that its domain is weakly compact and that a linearly perturbed minimization oracle is accessible. Finally, the affine constraint is addressed by an augmented Lagrangian approach. We discuss both exact and inexact (stochastic) variants of the algorithm.
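In the notation used here for illustration (the symbol names are not necessarily the speaker's), the problem template described above can be written as

\[
\min_{x \in \mathcal{H}} \; f(x) + g(x) + h(x) \quad \text{subject to} \quad Ax = b,
\]

where f is differentiable (without a Lipschitz-gradient assumption), g is "simple" in the sense that its proximal mapping is easy to evaluate, h has a weakly compact domain and admits a linearly perturbed minimization oracle \(v \mapsto \operatorname{argmin}_{s} \langle v, s\rangle + h(s)\), and the affine constraint \(Ax = b\) is treated via the augmented Lagrangian.

As a rough illustration only, and not the speaker's exact CGALP update rules, a hybrid iteration of this kind (conditional-gradient direction from the oracle for h, proximal step on g, multiplier update for the constraint) might be organized along the following lines; all function names, step sizes, and the ordering of the steps below are assumptions made for this sketch.

import numpy as np

def cgalp_sketch(grad_f, prox_g, lmo_h, A, b, x0, mu0, rho=1.0, n_iter=200):
    # Illustrative sketch only -- NOT the exact CGALP updates from the talk.
    # Problem: min f(x) + g(x) + h(x)  subject to  A x = b.
    #   grad_f(x)    : gradient of the differentiable part f
    #   prox_g(x, t) : proximal mapping of the "simple" function g with step t
    #   lmo_h(v)     : linearly perturbed minimization oracle, returning
    #                  argmin_s <v, s> + h(s) over the (weakly compact) domain of h
    x = np.asarray(x0, dtype=float).copy()
    mu = np.asarray(mu0, dtype=float).copy()
    for k in range(n_iter):
        gamma = 2.0 / (k + 2.0)                       # vanishing open-loop step size
        # gradient of the smooth part of the augmented Lagrangian at (x, mu)
        v = grad_f(x) + A.T @ mu + rho * A.T @ (A @ x - b)
        s = lmo_h(v)                                   # conditional-gradient direction
        x = prox_g(x + gamma * (s - x), gamma)         # CG step followed by a prox step on g
        mu = mu + rho * gamma * (A @ x - b)            # augmented Lagrangian multiplier update
    return x, mu

If g is zero (so prox_g returns its first argument) and lmo_h is a vertex oracle over a polytope, the sketch reduces to an augmented-Lagrangian Frank-Wolfe-type scheme, which may help place the method relative to classical conditional gradient.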

This talk is part of the Cambridge Image Analysis Seminars series.
