Globally linearly convergent nonlinear conjugate gradients without Wolfe line search

This paper introduces a new nonlinear conjugate gradient method that can be combined with any efficient line search. Unless the function values diverge to $-\infty$, global convergence to a stationary point is proved for continuously differentiable objective functions with Lipschitz continuous gradient, and global linear convergence is proved if this stationary point is a strong local minimizer. Complexity bounds are given on the number of function evaluations needed to approximate a stationary point.
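The paper's specific search directions and line search are not reproduced in this abstract. As a purely illustrative sketch of the general setting, the following Python code runs a standard nonlinear conjugate gradient iteration (Polak-Ribiere+ directions, chosen here only for concreteness) with an Armijo backtracking line search, which enforces a sufficient-decrease condition but no Wolfe curvature condition; it is not the method analyzed in the paper.

```python
import numpy as np

def nonlinear_cg(f, grad, x0, tol=1e-6, max_iter=1000):
    """Illustrative nonlinear CG: Polak-Ribiere+ directions with an
    Armijo backtracking line search (no Wolfe curvature condition)."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                  # first step: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break                           # approximate stationary point
        slope = g @ d
        if slope >= 0:                      # safeguard: restart along
            d, slope = -g, -(g @ g)         # steepest descent
        # Armijo backtracking: accept the first step length giving
        # sufficient decrease; no second (Wolfe) condition is tested.
        alpha, fx = 1.0, f(x)
        while f(x + alpha * d) > fx + 1e-4 * alpha * slope:
            alpha *= 0.5
        x_new = x + alpha * d
        g_new = grad(x_new)
        # PR+ coefficient; the max with 0 restarts along steepest
        # descent whenever beta would turn negative.
        beta = max(0.0, g_new @ (g_new - g) / (g @ g))
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x

# Example: minimize the Rosenbrock function from a standard start.
f = lambda z: (1 - z[0])**2 + 100 * (z[1] - z[0]**2)**2
grad = lambda z: np.array([-2 * (1 - z[0]) - 400 * z[0] * (z[1] - z[0]**2),
                           200 * (z[1] - z[0]**2)])
print(nonlinear_cg(f, grad, [-1.2, 1.0]))  # iterates move toward (1, 1)
```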
