The Complexity of Large-scale Convex Programming under a Linear Optimization Oracle

This paper considers a general class of iterative optimization algorithms, referred to as linear-optimization-based convex programming (LCP) methods, for solving large-scale convex programming (CP) problems. The LCP methods, which cover the classic conditional gradient (CG) method (a.k.a. the Frank-Wolfe method) as a special case, access the feasible set only through a linear optimization oracle, i.e., they solve a linear optimization subproblem at each iteration. In this paper, we first establish a series of new lower complexity bounds for the LCP methods applied to different classes of CP problems, including smooth, nonsmooth, and certain saddle-point problems. We then formally establish the theoretical optimality or near optimality, in the large-scale case, of the CG method and its variants for solving different classes of CP problems. We also introduce several new optimal LCP methods, obtained by properly modifying Nesterov's accelerated gradient method, and demonstrate their possible advantages over the classic CG method for solving certain classes of CP problems.
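To make the oracle model concrete, here is a minimal sketch (not from the paper) of the classic conditional gradient (Frank-Wolfe) method with the standard step size 2/(k+2). The only access to the feasible set is through a linear optimization oracle; the quadratic objective, the simplex feasible set, and all function names below are illustrative assumptions.

```python
import numpy as np

def frank_wolfe(grad, linear_oracle, x0, num_iters=100):
    """Classic conditional gradient (Frank-Wolfe) method.

    Each iteration calls the linear optimization oracle to get
    v_k = argmin_{v in X} <grad(x_k), v>, then moves toward v_k
    with the standard diminishing step size gamma_k = 2/(k+2).
    """
    x = x0.copy()
    for k in range(num_iters):
        v = linear_oracle(grad(x))       # the only subproblem is linear
        gamma = 2.0 / (k + 2)            # standard step size schedule
        x = (1 - gamma) * x + gamma * v  # convex combination keeps x feasible
    return x

# Illustrative instance: minimize ||x - b||^2 over the unit simplex.
# Over the simplex, the linear oracle returns the vertex e_i whose
# gradient coordinate is smallest.
b = np.array([0.2, 0.5, 0.3])
grad = lambda x: 2.0 * (x - b)
oracle = lambda g: np.eye(len(g))[np.argmin(g)]
x_star = frank_wolfe(grad, oracle, x0=np.array([1.0, 0.0, 0.0]),
                     num_iters=5000)
```

Note that the iterates remain in the simplex automatically, since each update is a convex combination of feasible points; this is why a projection step (and hence a projection oracle) is never needed.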

Citation

Technical Report, Department of Industrial and Systems Engineering, University of Florida, May 2013.
