A family of spectral gradient methods for optimization

We propose a family of spectral gradient methods whose stepsize is a convex combination of the short Barzilai-Borwein (BB) stepsize and the long BB stepsize. It is shown that each member of the family shares a certain quasi-Newton property in the least-squares sense. The family also includes some other gradient methods as special cases. We prove that the family of methods is $R$-superlinearly convergent for two-dimensional strictly convex quadratics. Moreover, the family is $R$-linearly convergent in the $n$-dimensional case. Numerical results for the family under different parameter settings are presented and demonstrate that the proposed family is promising.
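For concreteness, the following is a minimal sketch of the stepsize family described above, applied to a strictly convex quadratic. It assumes the convex combination is taken with a fixed parameter $\gamma \in [0,1]$ (the paper may let the combination parameter vary per iteration); the names `family_stepsize` and `gamma` are illustrative and not from the paper.

```python
import numpy as np

def family_stepsize(s, y, gamma):
    """Stepsize as a convex combination of the short and long BB stepsizes.

    s = x_k - x_{k-1}, y = g_k - g_{k-1}; gamma in [0, 1] selects the member.
    gamma = 0 recovers the long BB step, gamma = 1 the short BB step.
    (Illustrative sketch; the paper may parameterize the family differently.)
    """
    sy = s @ y                       # positive for strictly convex quadratics
    long_bb = (s @ s) / sy           # long BB stepsize (BB1)
    short_bb = sy / (y @ y)          # short BB stepsize (BB2)
    return gamma * short_bb + (1.0 - gamma) * long_bb

# Gradient iteration on f(x) = 0.5 x^T A x - b^T x
A = np.diag([1.0, 10.0])
b = np.array([1.0, 1.0])
x = np.zeros(2)
g = A @ x - b
alpha = 1.0 / np.max(np.diag(A))     # safe initial stepsize
for _ in range(100):
    x_new = x - alpha * g
    g_new = A @ x_new - b
    s, y = x_new - x, g_new - g
    alpha = family_stepsize(s, y, gamma=0.5)
    x, g = x_new, g_new
    if np.linalg.norm(g) < 1e-10:
        break
print(x)  # approaches the minimizer A^{-1} b
```

Since $(s^\top y)^2 \le (s^\top s)(y^\top y)$ by the Cauchy-Schwarz inequality, the short BB stepsize never exceeds the long one, so every member of the family lies between the two BB stepsizes.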
