Intermediate gradient methods for smooth convex problems with inexact oracle

Between the robust but slow (primal or dual) gradient methods and the fast but error-sensitive fast gradient methods, our goal in this paper is to develop first-order methods for smooth convex problems with intermediate speed and intermediate sensitivity to errors. We develop a general family of first-order methods, the Intermediate Gradient Method (IGM), based on two sequences of coefficients. We prove that the behavior of such a method is directly governed by the choice of these coefficients, and that the existing dual and fast gradient methods can be recovered with particular choices of the coefficients. Moreover, the freedom in the choice of these coefficients can also be used to generate intermediate behaviors. We propose a switching policy for the coefficients that allows us to view the corresponding IGM as a smart switch between the fast and dual gradient methods, reaching target accuracies that are unreachable by the fast gradient methods in significantly fewer iterations than the slow gradient methods require. With another choice of coefficients, we are also able to generate methods exhibiting the full spectrum of convergence rates, corresponding to every possible trade-off between the speed of the method and its robustness to errors.
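To make the speed-versus-robustness trade-off concrete, here is a minimal sketch (not the authors' IGM, which relies on two coefficient sequences and the inexact-oracle framework) of a first-order method whose momentum is damped by a single parameter: setting it to zero gives the plain gradient method, setting it to one gives a Nesterov-type fast gradient method, and intermediate values interpolate between the two. The names `intermediate_gradient_sketch`, `theta`, and `grad_f` are illustrative and do not come from the paper.

```python
import numpy as np

def intermediate_gradient_sketch(grad_f, x0, L, theta=0.5, n_iter=100):
    """Gradient method with momentum damped by theta in [0, 1].

    theta = 0 recovers the plain gradient method; theta = 1 a fast
    (accelerated) gradient method.  Illustrative only, not the paper's IGM.
    """
    x = y = np.asarray(x0, dtype=float)
    t = 1.0
    for _ in range(n_iter):
        x_next = y - grad_f(y) / L                 # gradient step from the extrapolated point
        t_next = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))
        beta = theta * (t - 1.0) / t_next          # momentum coefficient, damped by theta
        y = x_next + beta * (x_next - x)           # extrapolation (vanishes when theta = 0)
        x, t = x_next, t_next
    return x

# Example: minimize the smooth convex quadratic f(x) = 0.5 * ||A x - b||^2.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A = rng.standard_normal((40, 20))
    b = rng.standard_normal(40)
    L = np.linalg.norm(A, 2) ** 2                  # Lipschitz constant of the gradient
    grad = lambda x: A.T @ (A @ x - b)
    x_hat = intermediate_gradient_sketch(grad, np.zeros(20), L, theta=0.5, n_iter=500)
    print(np.linalg.norm(grad(x_hat)))             # small gradient norm indicates near-optimality
```

In the exact-gradient setting, larger damping gives faster convergence; when the gradients come from an inexact oracle, stronger momentum also accumulates errors more aggressively, which is the tension the IGM coefficient sequences are designed to control.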

Citation

CORE Discussion Paper 2013/17
