Calculus of Variations and Geometric Measure Theory

K. Bredies - M. Carioni - S. Fanzon - D. Walter

Asymptotic linear convergence of Fully-Corrective Generalized Conditional Gradient methods


Published Paper

Inserted: 15 Oct 2021
Last Updated: 13 Jul 2023

Journal: Mathematical Programming
Year: 2023
DOI: 10.1007/s10107-023-01975-z

arXiv: 2110.06756

Abstract:

We propose a fully-corrective generalized conditional gradient method (FC-GCG) for the minimization of the sum of a smooth, convex loss function and a convex one-homogeneous regularizer over a Banach space. The algorithm relies on the mutual update of a finite set $\mathcal{A}_k$ of extremal points of the unit ball of the regularizer and of an iterate $u_k \in \operatorname{cone}(\mathcal{A}_k)$. Each iteration requires the solution of one linear problem to update $\mathcal{A}_k$ and of one finite-dimensional convex minimization problem to update the iterate. Under standard hypotheses on the minimization problem, we show that the algorithm converges sublinearly to a solution. Subsequently, imposing additional assumptions on the associated dual variables, we improve this to a linear rate of convergence. The proof of both results relies on two key observations: first, we prove the equivalence of the considered problem to the minimization of a lifted functional over a particular space of Radon measures using Choquet's theorem; second, we connect the FC-GCG algorithm to a Primal-Dual-Active-Point method (PDAP) applied to the lifted problem, for which we finally derive the desired convergence rates.
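To make the iteration structure concrete, the following Python fragment is a minimal schematic of the FC-GCG loop described in the abstract. The oracles `grad_F`, `lmo`, and `solve_finite_dim` are hypothetical names introduced here for illustration (they do not come from the paper); the sketch only shows the two subproblems solved per iteration, the linear update of $\mathcal{A}_k$ and the finite-dimensional convex minimization over $\operatorname{cone}(\mathcal{A}_k)$, together with the pruning of inactive atoms.

    def fc_gcg(grad_F, lmo, solve_finite_dim, u0, n_iter=50, drop_tol=1e-10):
        """Schematic FC-GCG loop; the interface is an assumption, not the
        authors' implementation.

        grad_F(u)           -- gradient of the smooth, convex loss at u
        lmo(p)              -- linear minimization oracle: returns an extremal
                               point of the regularizer's unit ball optimizing
                               the linear functional induced by p
        solve_finite_dim(A) -- nonnegative coefficients c solving the
                               finite-dimensional convex subproblem over cone(A)
        u0                  -- initial iterate, e.g. the zero element
        """
        A, u = [], u0                      # active set A_k and iterate u_k
        for _ in range(n_iter):
            p = -grad_F(u)                 # dual variable at the current iterate
            A.append(lmo(p))               # (i) one linear problem updates A_k
            c = solve_finite_dim(A)        # (ii) finite-dim convex minimization
            # drop atoms with (numerically) zero weight, keeping A_k finite/small
            kept = [(cj, a) for cj, a in zip(c, A) if cj > drop_tol]
            A = [a for _, a in kept]
            u = sum(cj * a for cj, a in kept)  # u_{k+1} in cone(A_{k+1})
        return u, A

In this reading, the sublinear rate comes from the conditional-gradient structure of step (i), while the linear rate established in the paper additionally exploits the fully-corrective resolution of step (ii) under the stated dual-variable assumptions.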

