Lower Bounds for Linear Minimization Oracle Methods Optimizing over Strongly Convex Sets

We consider the oracle complexity of constrained convex optimization given access to a Linear Minimization Oracle (LMO) for the constraint set and a gradient oracle for the $L$-smooth, strongly convex objective. This model includes Frank-Wolfe methods and their many variants. First, over the problem class of strongly convex constraint sets $S$, our main result proves that no such deterministic method can guarantee a final objective gap less than $\varepsilon$ in fewer than $\Omega(\sqrt{L\, \mathrm{diam}(S)^2/\varepsilon})$ iterations. Our lower bound matches, up to constants, the accelerated Frank-Wolfe theory of Garber and Hazan (2015). Together, these results establish $\Theta(\sqrt{L\, \mathrm{diam}(S)^2/\varepsilon})$ as the optimal complexity for deterministic LMO methods over strongly convex constraint sets. Second, we consider optimization over $\beta$-smooth sets, finding that in the modestly smooth regime of $\beta=\Omega(1/\sqrt{\varepsilon})$, no complexity improvement for span-based LMO methods is possible against either compact convex sets or strongly convex sets.
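To make the oracle model concrete, the sketch below shows the vanilla Frank-Wolfe method with the standard $2/(k+2)$ step size, using an LMO for a Euclidean ball (a canonical strongly convex set). All names here (`lmo_ball`, `frank_wolfe`, the quadratic test objective) are illustrative choices, not constructions from the paper; the paper's results concern lower bounds for this class of methods, not any particular implementation.

```python
import numpy as np

def lmo_ball(g, center, radius):
    """LMO for the Euclidean ball S = {x : ||x - center|| <= radius},
    a strongly convex set: returns argmin_{s in S} <g, s>."""
    return center - radius * g / np.linalg.norm(g)

def frank_wolfe(grad, lmo, x0, steps=100):
    """Vanilla Frank-Wolfe: one LMO call and one gradient call per
    iteration, with the classical step size gamma_k = 2/(k+2)."""
    x = x0
    for k in range(steps):
        s = lmo(grad(x))            # linear minimization over S
        gamma = 2.0 / (k + 2)
        x = x + gamma * (s - x)     # convex combination stays in S
    return x

# Illustrative problem: minimize the smooth, strongly convex objective
# f(x) = 0.5 * ||x - b||^2 over the unit ball, with b outside the ball,
# so the minimizer is the projection of b onto the sphere.
b = np.array([2.0, 0.0])
grad = lambda x: x - b
x_star = frank_wolfe(grad, lambda g: lmo_ball(g, np.zeros(2), 1.0),
                     np.zeros(2))
# x_star approaches [1.0, 0.0], the projection of b onto the unit ball.
```

Each iteration spends exactly one LMO call and one gradient call, which is the unit of cost counted by the complexity bounds above; the lower bound says no deterministic method in this model can beat $\Omega(\sqrt{L\,\mathrm{diam}(S)^2/\varepsilon})$ such iterations over strongly convex sets.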
