Efficient First-Order Methods for Linear Programming and Semidefinite Programming

We present a simple transformation of any linear program or semidefinite program into an equivalent convex optimization problem whose only constraints are linear equations. The objective function is defined on the whole space, making virtually all subgradient methods immediately applicable. We observe, moreover, that the objective function is naturally ``smoothed,'' thereby allowing most first-order methods to be applied. We develop complexity bounds in the unsmoothed case for a particular subgradient method, and in the smoothed case for Nesterov's original ``optimal'' first-order method for smooth functions, achieving the desired bounds on the number of iterations, $O(1/\epsilon^2)$ and $O(1/\epsilon)$, respectively. However, unlike most of the literature on first-order methods, we measure error relatively rather than absolutely; on the other hand, also unlike most of the literature, we require only the level sets to be bounded, not the entire feasible region. Perhaps most surprising is that both the transformation and the basic theory are simple, and yet the approach has been overlooked until now: a blind spot. Once the transformation is realized, establishing the complexity bounds is mostly straightforward, drawing on various works of Nesterov.
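The abstract does not spell out the transformation, so the following is a hedged sketch of its likely form in the conic setting; the notation here (the strictly feasible point $e$, the function $\lambda_{\min,e}$, and the target value $z$) and the exact conditions on $z$ are our reading and should be checked against the paper itself. Write the linear program or semidefinite program as $\min_x \langle c,x\rangle$ subject to $Ax = b$, $x \in K$, where $K$ is the nonnegative orthant or the cone of positive-semidefinite matrices, and suppose a strictly feasible point $e$ is known. For a fixed target value $z < \langle c,e\rangle$, the transformed problem is

$$ \max_x \; \lambda_{\min,e}(x) \quad \text{subject to} \quad Ax = b, \;\; \langle c,x\rangle = z, $$

where $\lambda_{\min,e}(x) = \max\{t : x - t\,e \in K\}$; for an LP with $e > 0$ this is $\min_i x_i/e_i$, and for an SDP it is the minimum eigenvalue of $x$ relative to $e$. This objective is concave and finite on the whole space, so the only constraints are linear equations, as the abstract states. Feasible points of the transformed problem map back to the original feasible region by the radial projection

$$ x \;\mapsto\; e + \frac{1}{1 - \lambda_{\min,e}(x)}\,(x - e), $$

under which, for suitable $z$, optimal solutions of the two problems correspond.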

Citation

arXiv:1409.5832
