A family of optimal weighted conjugate-gradient-type methods for strictly convex quadratic minimization

We introduce a family of weighted conjugate-gradient-type methods for minimizing strictly convex quadratic functions, whose parameters are determined by a minimization model based on a convex combination of the objective function and its gradient norm. This family includes the classical linear conjugate gradient method and the recently published delayed weighted gradient method as the extreme cases of the convex combination. The intermediate cases produce a merit function that offers a compromise between function-value reduction and stationarity, which is convenient for real applications. We show that every one of the infinitely many members of the family exhibits q-linear convergence to the unique solution. Moreover, each member enjoys finite termination and an optimality property related to the combined merit function. In particular, we prove that if the $n\times n$ Hessian of the quadratic function has $p < n$ distinct eigenvalues, then each member of the family reaches the unique global minimizer in exactly $p$ iterations. Numerical results demonstrate that the proposed family is promising and exhibits fast convergence behavior, which motivates the use of preconditioning strategies as well as its extension to the numerical solution of general unconstrained optimization problems.
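The abstract names the classical linear conjugate gradient method as one extreme case of the weighted family. As a point of reference only (not the paper's parametrized family itself), the following is a minimal sketch of that extreme applied to a strictly convex quadratic; the names linear_cg, A, b, and x0 and the NumPy implementation are illustrative assumptions, not notation from the paper.

```python
import numpy as np

def linear_cg(A, b, x0, tol=1e-10, max_iter=None):
    """Classical linear CG for min_x 0.5*x@A@x - b@x with A symmetric
    positive definite; one extreme case of the weighted family."""
    x = np.asarray(x0, dtype=float).copy()
    max_iter = b.size if max_iter is None else max_iter
    r = b - A @ x                       # residual = negative gradient
    p = r.copy()
    rs_old = r @ r
    iters = 0
    while np.sqrt(rs_old) > tol and iters < max_iter:
        Ap = A @ p
        alpha = rs_old / (p @ Ap)       # exact step length along p
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        p = r + (rs_new / rs_old) * p   # A-conjugate direction update
        rs_old = rs_new
        iters += 1
    return x, iters

# Illustrative check: an SPD Hessian with p = 3 distinct eigenvalues
# {1, 2, 5} is solved in at most 3 iterations, consistent with the
# finite-termination property stated in the abstract.
rng = np.random.default_rng(0)
Q, _ = np.linalg.qr(rng.standard_normal((8, 8)))
A = Q @ np.diag([1.0, 1.0, 2.0, 2.0, 2.0, 5.0, 5.0, 5.0]) @ Q.T
b = rng.standard_normal(8)
x_star, iters = linear_cg(A, b, np.zeros(8))
print(iters, np.linalg.norm(A @ x_star - b))
```

For a Hessian with $p$ distinct eigenvalues, this classical routine terminates in at most $p$ iterations; the paper strengthens and generalizes this to exactly $p$ iterations for every member of the weighted family.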
