A speed-up strategy for gradient methods

In this paper, we propose a new acceleration strategy for gradient-based methods applied to strictly convex Quadratic Programming (QP) problems. The strategy consists of performing, at selected iterations, minimization steps along alternative descent directions or even within low-dimensional affine subspaces. In particular, exploiting the separate contributions of the linear and quadratic parts of the objective function can be useful for designing the line searches performed in the acceleration steps. We present numerical experiments to assess the impact of acceleration steps on the performance of different gradient methods. We consider randomly generated QP and box-constrained QP test problems, designed to evaluate the algorithms under various conditions, such as matrix dimension, condition number, and initialization strategy. Our experiments show that the use of acceleration steps in some Barzilai-Borwein methods significantly improves computational results. Moving to general minimization problems, the extension of our approach is not straightforward; in particular, the two-dimensional minimization phase cannot be extended directly. In this work, we take a first step in this direction by providing preliminary ideas for a possible extension of the accelerated algorithm to the minimization of general functions.
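To make the idea concrete, the following is a minimal sketch, not the paper's actual algorithm: a Barzilai-Borwein (BB1) gradient method for a strictly convex QP, min 0.5 x'Ax - b'x with A symmetric positive definite, where every few iterations an exact minimization is performed over a two-dimensional subspace (here, illustratively, the span of the two most recent gradients). For a quadratic objective, both the BB step and the subspace step have closed forms obtained from the quadratic and linear parts of the objective; all names and the choice of subspace are assumptions for illustration.

```python
import numpy as np

def bb_qp_accelerated(A, b, x0, tol=1e-8, max_iter=1000, accel_every=10):
    """BB1 gradient method for min 0.5 x'Ax - b'x (A SPD), with an
    occasional exact 2-D subspace minimization step (an illustrative
    sketch of the acceleration idea, not the paper's exact scheme)."""
    x = x0.copy()
    g = A @ x - b            # gradient of the quadratic objective
    x_prev = g_prev = None
    for k in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        if g_prev is not None and (k + 1) % accel_every == 0:
            # Acceleration step: exact minimization over the affine
            # subspace x + span{g, g_prev}. For a quadratic, the
            # optimal coefficients t solve (D'AD) t = -D'g.
            D = np.column_stack([g, g_prev])
            M = D.T @ A @ D
            t = np.linalg.lstsq(M, -D.T @ g, rcond=None)[0]
            x_new = x + D @ t
        else:
            if g_prev is None:
                alpha = 1.0 / np.linalg.norm(g)   # safe first step
            else:
                s, y = x - x_prev, g - g_prev
                alpha = (s @ s) / (s @ y)         # BB1 step length
            x_new = x - alpha * g
        x_prev, g_prev = x, g
        x = x_new
        g = A @ x - b
    return x, k
```

Since A is positive definite, s'y = s'As > 0, so the BB1 step length is always positive; the least-squares solve guards the 2-D step against near-parallel gradients close to the solution.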
