Efficient Quadratic Corrections for Frank-Wolfe Algorithms

5 citations · #654 of 5858 papers in NeurIPS 2025 · 5 authors · 4 data points

Abstract

We develop a Frank-Wolfe algorithm with corrective steps, generalizing previous algorithms including blended conditional gradients, blended pairwise conditional gradients, and fully-corrective Frank-Wolfe. For this, we prove tight convergence guarantees together with an optimal face identification property. Furthermore, we propose two highly efficient corrective steps for convex quadratic objectives based on linear optimization or linear system solving, akin to Wolfe's minimum-norm point, and show that they converge in finite time under suitable conditions. Beyond optimization problems that are directly quadratic, we revisit two algorithms - split conditional gradient and second-order conditional gradient sliding - which can leverage quadratic corrections to accelerate their quadratic subproblems. We demonstrate improved convergence rates for the first and broader applicability for the second, which may be of independent interest. Finally, we show substantial computational speedups for Frank-Wolfe-based algorithms with quadratic corrections across the considered problem classes.
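To make the idea of a quadratic corrective step concrete, the following is a minimal sketch, not taken from the paper: a standard Frank-Wolfe loop for a convex quadratic f(x) = 0.5 x'Qx + b'x over the convex hull of a finite set of atoms, where each iteration is followed by a correction that re-solves the quadratic over the affine hull of the currently active vertices through a small KKT linear system, loosely in the spirit of Wolfe's minimum-norm point algorithm. The function name, tolerances, and the acceptance test for the correction are illustrative assumptions rather than the paper's procedure.

```python
import numpy as np


def fw_with_quadratic_correction(Q, b, V, iters=100, tol=1e-10):
    """Illustrative sketch: Frank-Wolfe over conv(rows of V) for
    f(x) = 0.5*x'Qx + b'x, with a corrective step that re-solves the
    quadratic over the affine hull of the active vertices."""
    n_atoms, _ = V.shape
    weights = np.zeros(n_atoms)
    weights[0] = 1.0                      # start at an arbitrary vertex
    x = V[0].astype(float)

    for _ in range(iters):
        grad = Q @ x + b
        s_idx = int(np.argmin(V @ grad))  # linear minimization oracle
        d_fw = V[s_idx] - x
        gap = -grad @ d_fw                # Frank-Wolfe dual gap
        if gap < tol:
            break
        curv = d_fw @ Q @ d_fw            # exact line search for quadratics
        gamma = 1.0 if curv <= 0 else min(1.0, gap / curv)
        weights *= 1.0 - gamma
        weights[s_idx] += gamma
        x = x + gamma * d_fw

        # Corrective step (sketch): minimize f over the affine hull of the
        # active vertices by solving the equality-constrained QP's KKT
        # system; accept only if the result is a convex combination.
        active = np.flatnonzero(weights > 1e-12)
        A = V[active].astype(float)       # one active vertex per row
        k = len(active)
        kkt = np.block([[A @ Q @ A.T, np.ones((k, 1))],
                        [np.ones((1, k)), np.zeros((1, 1))]])
        rhs = np.concatenate([-A @ b, [1.0]])
        sol = np.linalg.lstsq(kkt, rhs, rcond=None)[0]  # robust to degeneracy
        lam = sol[:k]
        if np.all(lam >= 0) and abs(lam.sum() - 1.0) < 1e-8:
            weights[:] = 0.0
            weights[active] = lam
            x = A.T @ lam
    return x
```

As a sanity check, calling this with Q = np.eye(d), b = -c, and V = np.eye(d) minimizes 0.5||x||^2 - c'x over the probability simplex, i.e., it computes the Euclidean projection of c onto the simplex.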

Citation History

Jan 26, 2026: 0
Jan 27, 2026: 0
Feb 1, 2026: 5 (+5)