A Nonlinear Conjugate Gradient Algorithm with an Optimal Property and an Improved Wolfe Line Search

In this paper, we seek the conjugate gradient direction closest to the direction of the scaled memoryless BFGS method and propose a family of conjugate gradient methods for unconstrained optimization. An improved Wolfe line search is also proposed, which avoids a numerical drawback of the standard Wolfe line search and guarantees the global convergence of the conjugate gradient method under mild conditions. To accelerate the algorithm, we develop an adaptive strategy for choosing the initial stepsize and introduce dynamic restarts along the negative gradient, triggered by how close the function has been to some quadratic function over the preceding iterations. Numerical experiments with the CUTEr collection show that the proposed algorithm is promising.
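To make the overall structure concrete, the following is a minimal sketch of a generic nonlinear conjugate gradient loop with a weak Wolfe line search and restarts along the negative gradient. It is not the authors' method: the update below is the classical Polak-Ribière+ rule rather than the family derived in the paper, and all parameter values (c1, c2, alpha0, the restart test) are illustrative assumptions, not the paper's choices.

```python
# Illustrative sketch only: a generic nonlinear CG loop with a weak Wolfe
# line search. The beta formula is the classical PR+ choice, not the
# family proposed in the paper; parameters are placeholder assumptions.
import numpy as np

def wolfe_line_search(f, grad, x, d, c1=1e-4, c2=0.1, alpha0=1.0, max_iter=50):
    """Bisection-style search for a step satisfying the weak Wolfe conditions."""
    lo, hi = 0.0, np.inf
    alpha = alpha0
    f0, g0d = f(x), grad(x) @ d
    for _ in range(max_iter):
        if f(x + alpha * d) > f0 + c1 * alpha * g0d:   # sufficient decrease fails
            hi = alpha
            alpha = 0.5 * (lo + hi)
        elif grad(x + alpha * d) @ d < c2 * g0d:       # curvature condition fails
            lo = alpha
            alpha = 2.0 * lo if np.isinf(hi) else 0.5 * (lo + hi)
        else:
            return alpha
    return alpha

def nonlinear_cg(f, grad, x0, tol=1e-6, max_iter=1000):
    x = x0.copy()
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        alpha = wolfe_line_search(f, grad, x, d)
        x_new = x + alpha * d
        g_new = grad(x_new)
        y = g_new - g
        beta = max(0.0, (g_new @ y) / (g @ g))         # PR+ beta (illustrative)
        d = -g_new + beta * d
        if g_new @ d >= 0:                             # not a descent direction:
            d = -g_new                                 # restart along -gradient
        x, g = x_new, g_new
    return x

# Usage on the Rosenbrock test function:
if __name__ == "__main__":
    f = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
    grad = lambda x: np.array([-2*(1 - x[0]) - 400*x[0]*(x[1] - x[0]**2),
                               200*(x[1] - x[0]**2)])
    print(nonlinear_cg(f, grad, np.array([-1.2, 1.0])))
```

The abstract's improvements would slot into this skeleton: the improved Wolfe line search would replace the sufficient-decrease test above to avoid its numerical drawback, the adaptive initial stepsize would replace the fixed alpha0, and the dynamic restart rule would replace the simple descent check.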

Citation

Report, AMSS, Chinese Academy of Sciences, Beijing, October 2010.
