Two efficient gradient methods with approximately optimal stepsizes based on regularization models for unconstrained optimization

It is widely accepted that the stepsize is of great significance to gradient methods. Two efficient gradient methods with approximately optimal stepsizes, based mainly on regularization models, are proposed for unconstrained optimization. More precisely, if the objective function is not close to a quadratic function on the line segment between the current and latest iterates, regularization models are exploited carefully to generate approximately optimal stepsizes; otherwise, quadratic approximation models are used. In particular, when the curvature is non-positive, special regularization models are developed. The convergence of the proposed methods is established under weak conditions. Extensive numerical experiments indicate that the proposed methods are superior to the BBQ method (SIAM J. Optim. 31(4), 3068-3096, 2021) and other efficient gradient methods, and are competitive with two famous and efficient conjugate gradient software packages, CG_DESCENT (5.0) (SIAM J. Optim. 16(1), 170-192, 2005) and CGOPT (1.0) (SIAM J. Optim. 23(1), 296-320, 2013), on the CUTEr library. Owing to this surprising efficiency, we believe that gradient methods with approximately optimal stepsizes can become strong candidates for large-scale unconstrained optimization.
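The sketch below illustrates, under stated assumptions, the model-switching idea the abstract describes: a quadratic-closeness test on the last step decides between a BB-type stepsize from a quadratic model and a stepsize minimizing a cubic regularization model, with a special regularized model when the curvature estimate is non-positive. The concrete closeness test, the regularization parameter sigma, and the function name aos_gradient_method are plausible stand-ins for illustration, not the authors' exact formulas, and the globalization safeguards used in the paper are omitted.

```python
import numpy as np

def aos_gradient_method(f, grad, x0, sigma=1.0, mu=1e-2,
                        tol=1e-6, max_iter=10000):
    """Minimal sketch of a gradient method with approximately optimal
    stepsizes. The branching mirrors the abstract's description; the
    formulas below are hypothetical stand-ins, not the published method."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    x_old = g_old = f_old = None

    for _ in range(max_iter):
        gnorm = np.linalg.norm(g)
        if gnorm <= tol:
            break
        fx = f(x)
        if x_old is None:
            alpha = 1.0 / max(gnorm, 1.0)          # conservative first step
        else:
            s, y = x - x_old, g - g_old
            sty, sts = s @ y, s @ s
            # Quadratic-closeness test (hypothetical): if f were quadratic,
            # f(x_old) = f(x) - g.s + 0.5 s.y would hold exactly.
            quad_err = abs(f_old - (fx - g @ s + 0.5 * sty))
            if sty <= 0.0:
                # Non-positive curvature: minimize the special regularization
                # model fx - a*|g|^2 + (sigma/6) a^3 |g|^3 over a > 0.
                alpha = np.sqrt(2.0 / (sigma * gnorm))
            elif quad_err <= mu * max(abs(fx), 1.0):
                # Close to quadratic: BB1 stepsize from the quadratic model.
                alpha = sts / sty
            else:
                # Cubic regularization model:
                # fx - a*|g|^2 + 0.5 a^2 q + (sigma/6) a^3 |g|^3,
                # with q = (s.y / s.s) |g|^2 as a crude curvature estimate.
                q = (sty / sts) * gnorm ** 2
                alpha = (-q + np.sqrt(q * q + 2.0 * sigma * gnorm ** 5)) \
                        / (sigma * gnorm ** 3)
        alpha = min(max(alpha, 1e-10), 1e10)       # crude stepsize safeguard
        x_old, g_old, f_old = x, g, fx
        x = x - alpha * g
        g = grad(x)
    return x
```

For instance, `aos_gradient_method(lambda x: x @ x, lambda x: 2.0 * x, np.ones(5))` converges quickly: on a convex quadratic the closeness test holds after the first step, so the iteration reduces to a BB-type method, which is consistent with the abstract's design of falling back to quadratic approximation models whenever they are adequate.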
