A nonlinear conjugate gradient method with complexity guarantees and its application to nonconvex regression

Nonlinear conjugate gradient methods are among the most popular techniques for solving continuous optimization problems. Although these schemes have long been studied from a global convergence standpoint, their worst-case complexity properties have yet to be fully understood, especially in the nonconvex setting. In particular, it is unclear whether such methods possess better guarantees than first-order methods such as gradient descent. On the other hand, recent results have shown good performance of standard nonlinear conjugate gradient methods on nonconvex problems, even when compared with methods endowed with the best known complexity guarantees. In this paper, we propose a nonlinear conjugate gradient method based on a simple line-search paradigm and a modified restart condition. These two ingredients allow for monitoring the properties of the search directions, which is instrumental in obtaining complexity guarantees. Our complexity results illustrate the possible discrepancy between nonlinear conjugate gradient methods and classical gradient descent. Numerical experiments on nonconvex robust regression problems suggest that the restart condition has little impact on the practical behavior.
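To give a rough flavor of the two ingredients mentioned in the abstract, the sketch below implements a generic Polak-Ribière-type nonlinear conjugate gradient iteration with an Armijo backtracking line search and a restart to the steepest-descent direction whenever the current direction fails a sufficient-descent test. The restart test, the parameter eta, and the robust loss used in the usage example are illustrative assumptions; this is not the specific method, restart condition, or line search analyzed in the paper.

```python
import numpy as np

def nonlinear_cg(f, grad, x0, tol=1e-6, max_iter=1000, eta=0.01):
    """Generic nonlinear CG (Polak-Ribiere+) with Armijo backtracking
    and a descent-based restart. Illustrative sketch only."""
    x = x0.copy()
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        gnorm2 = g @ g
        if np.sqrt(gnorm2) <= tol:
            break
        # Restart with steepest descent if d is not a sufficiently good
        # descent direction (hypothetical test with parameter eta).
        if g @ d > -eta * gnorm2:
            d = -g
        # Armijo backtracking line search.
        t, fx, slope = 1.0, f(x), g @ d
        while f(x + t * d) > fx + 1e-4 * t * slope and t > 1e-12:
            t *= 0.5
        x_new = x + t * d
        g_new = grad(x_new)
        # Polak-Ribiere+ conjugacy coefficient.
        beta = max(0.0, g_new @ (g_new - g) / gnorm2)
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x

# Example: nonconvex robust linear regression with the smooth loss
# rho(t) = t^2 / (1 + t^2), chosen here purely for illustration; the
# experiments in the paper may use a different objective.
rng = np.random.default_rng(0)
A = rng.standard_normal((100, 5))
b = A @ rng.standard_normal(5) + 0.1 * rng.standard_normal(100)

def f(x):
    r = A @ x - b
    return np.sum(r**2 / (1.0 + r**2))

def grad(x):
    r = A @ x - b
    return A.T @ (2.0 * r / (1.0 + r**2)**2)

x_star = nonlinear_cg(f, grad, np.zeros(5))
```

In this sketch the restart test is what keeps every search direction a provable descent direction, which is the kind of property that makes complexity analysis tractable; the paper's own restart condition and guarantees differ in the details.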

Citation

R. Chan-Renous-Legoubin and C. W. Royer, A nonlinear conjugate gradient method with complexity guarantees and its application to nonconvex regression. Technical report arXiv:2201.08568, January 2022.

Article

Available as a preprint at arXiv:2201.08568.