We describe three algorithms for solving differentiable convex optimization problems constrained to simple sets in $ \R^n $, i.e., sets onto which it is easy to project an arbitrary point. The first two algorithms are optimal in the sense that they achieve an absolute precision of $ \varepsilon $ with respect to the optimal value in $ O(1/\sqrt\varepsilon) $ iterations using only first-order information. This complexity depends on a Lipschitz constant $ L $ for the function derivatives and on a strong convexity constant $ \mu\geq 0 $. The first algorithm extends to the constrained case a well-known method devised by Nesterov \cite{nesterov1} for unconstrained problems, and includes a procedure for estimating the unknown value of $ L $. The complexity analysis follows a simpler geometric approach. The other two algorithms incorporate several enhancements, including line searches that improve the performance: the second algorithm is both enhanced and optimal, while the third trades some optimality for the best practical performance. Numerical tests for box-constrained quadratic problems are presented in the last section.
Citation
AN OPTIMAL ALGORITHM FOR CONSTRAINED DIFFERENTIABLE CONVEX OPTIMIZATION. Technical Report, Federal University of Paraná, Dep. of Mathematics, Brazil, June 2011.
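The first algorithm described in the abstract is, in essence, Nesterov's accelerated gradient method with each iterate projected back onto the simple feasible set. The following is a minimal sketch of that idea for the box-constrained quadratic case mentioned in the numerical tests; it assumes $L$ and $\mu$ are known (so it omits the paper's procedure for guessing $L$ and its line-search enhancements), and all function names are hypothetical, not taken from the paper.

```python
import numpy as np

def project_box(x, lo, hi):
    """Projection onto the box [lo, hi] -- an example of a 'simple set'."""
    return np.clip(x, lo, hi)

def projected_nesterov(grad, L, x0, project, mu=0.0, iters=500):
    """Sketch of an accelerated projected gradient method (Nesterov-style).

    grad    : gradient of the convex objective
    L       : Lipschitz constant of the gradient
    project : projection onto the feasible set
    mu      : strong convexity constant (mu >= 0)
    """
    x = x0.copy()
    y = x0.copy()          # extrapolated point
    theta = 1.0
    q = mu / L
    for _ in range(iters):
        # Projected gradient step from the extrapolated point, step size 1/L.
        x_new = project(y - grad(y) / L)
        # Standard update of the estimate-sequence parameter:
        # theta_new solves theta_new^2 = (1 - theta_new)*theta^2 + q*theta_new.
        t2 = theta * theta
        theta_new = 0.5 * (q - t2 + np.sqrt((t2 - q) ** 2 + 4.0 * t2))
        beta = theta * (1.0 - theta) / (t2 + theta_new)
        # Momentum (extrapolation) step.
        y = x_new + beta * (x_new - x)
        x, theta = x_new, theta_new
    return x

# Usage: minimize 0.5 x'Ax - b'x over the box [0, 0.5]^2,
# with A = diag(1, 10), so L = 10 and mu = 1.
A = np.diag([1.0, 10.0])
b = np.ones(2)
x_star = projected_nesterov(
    grad=lambda x: A @ x - b,
    L=10.0,
    x0=np.zeros(2),
    project=lambda v: project_box(v, 0.0, 0.5),
    mu=1.0,
)
```

For this separable quadratic the constrained minimizer is the componentwise clip of the unconstrained one, $A^{-1}b = (1, 0.1)$, onto the box, i.e. $(0.5, 0.1)$, which the iterates approach at the accelerated linear rate governed by $\sqrt{\mu/L}$.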