A unified framework for inexact adaptive stepsizes in the gradient method, the conjugate gradient method and the quasi-Newton method for strictly convex quadratic optimization

Inexact adaptive stepsizes for the conjugate gradient method and the quasi-Newton method are very rare. For strictly convex quadratic optimization, the exact stepsizes of the gradient method, the conjugate gradient method and the quasi-Newton method admit a unified framework, whereas a unified framework for inexact adaptive stepsizes in these three methods remains unknown. Motivated by these observations, we propose such a unified framework for inexact adaptive stepsizes in the gradient method, the conjugate gradient method and the quasi-Newton method for strictly convex quadratic optimization, which we call the approximately optimal stepsize. The global convergence and the convergence rate of the gradient method with the approximately optimal stepsize are established by exploiting the relation between the approximately optimal stepsize and the well-known Barzilai-Borwein (BB) stepsizes. Numerical results are presented that confirm the remarkable numerical advantage of the gradient method, the conjugate gradient method and the quasi-Newton method equipped with this unified framework for inexact adaptive stepsizes. Finally, some open problems concerning the gradient method, the conjugate gradient method and the quasi-Newton method with the approximately optimal stepsize are raised.
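To make the setting concrete, the sketch below (not the paper's proposed method) shows the gradient method with the classical Barzilai-Borwein (BB1) stepsize on a strictly convex quadratic f(x) = ½xᵀAx − bᵀx; the BB stepsize is a well-known instance of an inexact adaptive stepsize of the kind the abstract discusses. The function name, tolerances, and the small test matrix are illustrative assumptions.

```python
import numpy as np

def bb_gradient(A, b, x0, tol=1e-10, max_iter=500):
    """Gradient method with the BB1 stepsize for f(x) = 0.5 x^T A x - b^T x,
    where A is symmetric positive definite. Illustrative sketch only."""
    x = x0.astype(float)
    g = A @ x - b                      # gradient of the quadratic
    alpha = 1.0 / np.linalg.norm(A)    # conservative initial stepsize
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        x_new = x - alpha * g
        g_new = A @ x_new - b
        s, y = x_new - x, g_new - g    # iterate and gradient differences
        alpha = (s @ s) / (s @ y)      # BB1 stepsize: s^T s / s^T y
        x, g = x_new, g_new
    return x

# Example on a small strictly convex quadratic (A symmetric positive definite):
A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
x_star = bb_gradient(A, b, np.zeros(2))
```

For this problem the minimizer satisfies Ax* = b, so the returned iterate can be checked against the linear system directly. By contrast, the exact stepsize for the gradient method on a quadratic is αₖ = gₖᵀgₖ / gₖᵀAgₖ, which requires a matrix-vector product with A at every step; the BB stepsize avoids this by using only differences of iterates and gradients.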
