A nonlinear conjugate gradient method with complexity guarantees and its application to nonconvex regression

Nonlinear conjugate gradients are among the most popular techniques for solving continuous optimization problems. Although these schemes have long been studied from a global convergence standpoint, their worst-case complexity properties have yet to be fully understood, especially in the nonconvex setting. In particular, it is unclear whether nonlinear conjugate gradient methods possess better guarantees than first-order methods such as gradient descent. Meanwhile, recent experiments have shown impressive performance of standard nonlinear conjugate gradient techniques on certain nonconvex problems, even when compared with methods endowed with the best known complexity guarantees. In this paper, we propose a nonlinear conjugate gradient scheme based on a simple line-search paradigm and a modified restart condition. These two ingredients allow for monitoring the properties of the search directions, which is instrumental in obtaining complexity guarantees. Our complexity results illustrate the possible discrepancy between nonlinear conjugate gradient methods and classical gradient descent. A numerical investigation on nonconvex robust regression problems, as well as a standard benchmark, shows that the restarting condition can track the behavior of a standard implementation.
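A minimal sketch, in Python, of the kind of scheme the abstract describes: a Polak-Ribière+ nonlinear conjugate gradient iteration with a backtracking Armijo line search and a restart test that reverts to the steepest descent direction whenever the candidate direction is not a sufficiently good descent direction. The restart threshold eta, the line-search constants, and the Cauchy-loss regression example are illustrative assumptions, not the exact algorithm, parameters, or experiments of the paper.

```python
import numpy as np

def ncg_restart(f, grad, x0, eta=1e-2, c1=1e-4, tol=1e-6, max_iter=500):
    """Nonlinear conjugate gradient (Polak-Ribiere+) with a descent-based
    restart test and an Armijo backtracking line search (illustrative)."""
    x = x0.copy()
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        # Restart test: require -g^T d >= eta * ||g|| * ||d||, i.e. the
        # direction must remain safely within the descent cone; otherwise
        # restart from the steepest descent direction.
        if -g.dot(d) < eta * np.linalg.norm(g) * np.linalg.norm(d):
            d = -g
        # Backtracking Armijo line search along d.
        t, fx, gtd = 1.0, f(x), g.dot(d)
        while f(x + t * d) > fx + c1 * t * gtd and t > 1e-12:
            t *= 0.5
        x_new = x + t * d
        g_new = grad(x_new)
        # Polak-Ribiere+ conjugacy coefficient (clipped at zero).
        beta = max(0.0, g_new.dot(g_new - g) / g.dot(g))
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x

# Usage on a small nonconvex robust regression problem with a Cauchy-type
# loss, f(w) = sum_i log(1 + (a_i^T w - b_i)^2); data are synthetic.
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 5))
b = A @ rng.standard_normal(5) + 0.1 * rng.standard_normal(50)
f = lambda w: np.sum(np.log1p((A @ w - b) ** 2))
grad = lambda w: A.T @ (2 * (A @ w - b) / (1 + (A @ w - b) ** 2))
w_star = ncg_restart(f, grad, np.zeros(5))
```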

Citation

R. Chan--Renous-Legoubin and C. W. Royer, A nonlinear conjugate gradient method with complexity guarantees and its application to nonconvex regression. Technical report arXiv:2201.08568, January 2022.
