On the extension of the Hager-Zhang conjugate gradient method for vector optimization

The extension of the Hager-Zhang (HZ) nonlinear conjugate gradient method to vector optimization is discussed in the present research. In the scalar minimization case, this method generates descent directions whenever, for example, the line search satisfies the standard Wolfe conditions. We first show that, in general, the direct extension of the HZ method to vector optimization does not yield descent (in the vector sense), even when an exact line search is employed. By using a sufficiently accurate line search, we then propose a self-adjusting HZ method that possesses the descent property. With suitable parameters, the proposed HZ method reduces to the classical one in the scalar minimization case. Global convergence of the new scheme is proved without regular restarts or any convexity assumption. Finally, numerical experiments illustrating the practical behavior of the approach are presented, and comparisons with the Hestenes-Stiefel conjugate gradient method are discussed.
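For context, the abstract does not state the update rule itself; the classical scalar HZ scheme (Hager and Zhang, 2005), whose vector-valued extension is the subject of the report, generates search directions of the form

\[
  d_{k+1} = -g_{k+1} + \beta_k^{HZ} d_k, \qquad
  \beta_k^{HZ} = \frac{1}{d_k^\top y_k}
    \left( y_k - 2\, d_k \frac{\|y_k\|^2}{d_k^\top y_k} \right)^{\!\top} g_{k+1},
\]

where g_k = \nabla f(x_k) and y_k = g_{k+1} - g_k. The descent property mentioned above refers to this update under standard Wolfe line searches, and the paper studies how it carries over when the objective is vector valued.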

Citation

M. L. N. Gonçalves and L. F. Prudente, On the extension of the Hager-Zhang conjugate gradient method for vector optimization, technical report, 2018.