Globally linearly convergent nonlinear conjugate gradients without Wolfe line search

This paper introduces a new nonlinear conjugate gradient (CG) method based on an efficient gradient-free line search. Unless the function values diverge to $-\infty$, global convergence to a stationary point is proved for continuously differentiable objective functions with Lipschitz continuous gradient, and global linear convergence is proved when this stationary point is a strong local minimizer. Termination in at most $n$ iterations is also established for strictly convex quadratic functions on $\mathbb{R}^n$. Moreover, $O(\epsilon^{-2})$ complexity bounds on the number of function and gradient evaluations needed to approximate a stationary point are derived; these improve to $O(\log \epsilon^{-1})$ for objective functions having a strong minimizer and no other stationary points. A measure of zigzagging strength is also introduced, and the method is shown to zigzag minimally in this measure. Numerical results on the unconstrained CUTEst test problems show that the new method is competitive with the best state-of-the-art nonlinear CG methods proposed in the literature.
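The abstract does not specify the paper's direction update or its line search, but the following sketch illustrates the general algorithm class it belongs to: a nonlinear CG iteration whose line search consumes only function values, so no Wolfe curvature condition (and hence no extra gradient evaluation per trial step) is required. The name `cg_armijo`, the Polak-Ribière+ update, and the Armijo backtracking rule are illustrative assumptions, not the method proposed in the paper.

```python
import numpy as np

def cg_armijo(f, grad, x0, tol=1e-8, max_iter=1000):
    """Generic nonlinear CG with a Polak-Ribiere+ update and an Armijo
    backtracking line search. The line search evaluates only function
    values, so -- unlike a Wolfe search -- it needs no gradient
    evaluations at trial points. Not the paper's method; a sketch of
    the algorithm class."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    fx = f(x)
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        if g @ d >= 0:                # safeguard: restart along steepest descent
            d = -g
        # Backtracking on function values only (no Wolfe curvature test).
        alpha, c, rho, slope = 1.0, 1e-4, 0.5, g @ d
        while f(x + alpha * d) > fx + c * alpha * slope and alpha > 1e-16:
            alpha *= rho
        x = x + alpha * d
        g_new = grad(x)               # one gradient evaluation per iteration
        fx = f(x)
        beta = max(0.0, g_new @ (g_new - g) / (g @ g))   # PR+ formula
        d = -g_new + beta * d
        g = g_new
    return x, fx

# Example: minimize the 2-D Rosenbrock function.
rosen = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
rosen_grad = lambda x: np.array([
    -2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
    200 * (x[1] - x[0]**2),
])
x_star, f_star = cg_armijo(rosen, rosen_grad, np.array([-1.2, 1.0]))
```

Avoiding the Wolfe curvature test is what keeps the per-iteration gradient cost at one evaluation, which is the accounting behind the $O(\epsilon^{-2})$ evaluation-complexity bounds stated above.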
