In this talk we introduce a Frank-Wolfe-type algorithm for sparse optimization in Banach spaces. The functional we want to optimize consists of the sum of a smooth fidelity term and a convex one-homogeneous regularizer. We exploit the sparse structure of the variational problem by designing iterates as linear combinations of extremal points of the unit ball of the regularizer. For such iterates we prove global sublinear convergence of the algorithm. Then, under additional structural assumptions, we prove a local linear convergence rate. We apply this algorithm to the problem of particle tracking from heavily undersampled MRI data. This talk is based on the works cited below.
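The mechanism described above (iterates built as convex combinations of extremal points of the regularizer's unit ball, with a sublinear global rate) can be illustrated in finite dimensions. The following is a minimal sketch, not the Banach-space algorithm of the references: it takes the classical Frank-Wolfe method with the standard step size 2/(k+2), where the "unit ball" is the ℓ¹ ball whose extremal points are the signed coordinate vectors ±e_i. All function and variable names here are illustrative.

```python
import numpy as np

def frank_wolfe(grad_f, dim, radius=1.0, iters=100):
    """Minimize a smooth f over the l1 ball of given radius.

    Extremal points of the l1 ball are +/- radius * e_i, so each
    iterate is a convex combination of at most `iters` such points.
    """
    x = np.zeros(dim)
    for k in range(iters):
        g = grad_f(x)
        # Linear minimization oracle: the extremal point s of the
        # ball minimizing <g, s> is -radius * sign(g_i) * e_i,
        # where i maximizes |g_i|.
        i = int(np.argmax(np.abs(g)))
        s = np.zeros(dim)
        s[i] = -radius * np.sign(g[i])
        # Standard step size, giving the O(1/k) sublinear rate.
        gamma = 2.0 / (k + 2)
        x = (1.0 - gamma) * x + gamma * s
    return x
```

For instance, with the quadratic fidelity term f(x) = ½‖Ax − b‖², the gradient passed in is `lambda x: A.T @ (A @ x - b)`; the iterates then remain sparse combinations of coordinate vectors, mirroring the sparse structure exploited in the talk.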
[1] K. Bredies, M. Carioni, S. Fanzon, D. Walter. Asymptotic linear convergence of Fully-Corrective Generalized Conditional Gradient methods. Mathematical Programming, 2023.
[2] K. Bredies, S. Fanzon. An optimal transport approach for solving dynamic inverse problems in spaces of measures. ESAIM: Mathematical Modelling and Numerical Analysis, 54(6): 2351–2382, 2020.
[3] K. Bredies, M. Carioni, S. Fanzon, F. Romero. A Generalized Conditional Gradient Method for Dynamic Inverse Problems with Optimal Transport Regularization. Foundations of Computational Mathematics, 2022.
[4] K. Bredies, M. Carioni, S. Fanzon. On the extremal points of the ball of the Benamou–Brenier energy. Bulletin of the London Mathematical Society, 53: 1436–1452, 2021.
[5] K. Bredies, M. Carioni, S. Fanzon. A superposition principle for the inhomogeneous continuity equation with Hellinger–Kantorovich-regular coefficients. Communications in Partial Differential Equations, 47(10): 2023–2069, 2022.