A unified framework for inexact adaptive stepsizes in the gradient methods, the conjugate gradient methods and the quasi-Newton methods for strictly convex quadratic optimization

Inexact adaptive stepsizes for the conjugate gradient method and the quasi-Newton method are very rare. The exact stepsizes in the gradient method, the conjugate gradient method and the quasi-Newton method for strictly convex quadratic optimization admit a unified framework, while the unified framework for inexact adaptive stepsizes in the gradient method, the conjugate gradient …
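As background for the exact-stepsize framework the abstract mentions, here is a minimal sketch of the classical exact (Cauchy) stepsize for steepest descent on a strictly convex quadratic f(x) = ½xᵀAx − bᵀx: minimizing f along the negative gradient gives α = gᵀg / (gᵀAg). The matrix, vectors, and function names below are illustrative, not taken from the paper.

```python
# Exact stepsize for steepest descent on a strictly convex quadratic
# f(x) = 0.5 * x^T A x - b^T x, with A symmetric positive definite.
# Minimizing f(x - a*g) over a gives the exact (Cauchy) stepsize
# a = (g^T g) / (g^T A g).  Pure-Python 2x2 illustration.

def matvec(A, x):
    return [sum(A[i][j] * x[j] for j in range(len(x))) for i in range(len(A))]

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def steepest_descent(A, b, x0, iters=50):
    x = list(x0)
    for _ in range(iters):
        g = [gi - bi for gi, bi in zip(matvec(A, x), b)]  # gradient: A x - b
        gAg = dot(g, matvec(A, g))
        if gAg == 0:                      # gradient is zero: at the minimizer
            break
        alpha = dot(g, g) / gAg          # exact stepsize along -g
        x = [xi - alpha * gi for xi, gi in zip(x, g)]
    return x

A = [[4.0, 1.0], [1.0, 3.0]]   # symmetric positive definite
b = [1.0, 2.0]
x = steepest_descent(A, b, [0.0, 0.0])   # converges to the solution of A x = b
```

For this example the minimizer solves Ax = b, i.e. x = (1/11, 7/11); the iteration converges to it linearly at a rate governed by the condition number of A.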

Two efficient gradient methods with approximately optimal stepsizes based on regularization models for unconstrained optimization

It is widely accepted that the stepsize is of great significance to the gradient method. Two efficient gradient methods with approximately optimal stepsizes, mainly based on regularization models, are proposed for unconstrained optimization. More exactly, if the objective function is not close to a quadratic function on the line segment between the current and latest iterates, …
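To illustrate the general idea of an approximately optimal stepsize, here is a sketch of the classic Barzilai–Borwein (BB) stepsize, a well-known inexact adaptive stepsize that approximates the exact one from finite differences of iterates and gradients. This is a standard textbook method shown for context only, not the regularization-model stepsize proposed in the paper; all names below are illustrative.

```python
# Barzilai-Borwein (BB1) gradient method on a strictly convex quadratic
# f(x) = 0.5 * x^T A x - b^T x.  The stepsize a_k = (s^T s) / (s^T y),
# with s = x_k - x_{k-1} and y = g_k - g_{k-1}, approximates the inverse
# curvature along the latest step without forming g^T A g explicitly.

def matvec(A, x):
    return [sum(A[i][j] * x[j] for j in range(len(x))) for i in range(len(A))]

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def bb_gradient(A, b, x0, iters=60, alpha0=0.1):
    x = list(x0)
    g = [gi - bi for gi, bi in zip(matvec(A, x), b)]  # gradient: A x - b
    alpha = alpha0                                    # initial guess stepsize
    for _ in range(iters):
        x_new = [xi - alpha * gi for xi, gi in zip(x, g)]
        g_new = [gi - bi for gi, bi in zip(matvec(A, x_new), b)]
        s = [a - c for a, c in zip(x_new, x)]         # iterate difference
        y = [a - c for a, c in zip(g_new, g)]         # gradient difference
        sy = dot(s, y)
        if sy <= 1e-16:                               # safeguard: stop if flat
            break
        alpha = dot(s, s) / sy                        # BB1 stepsize
        x, g = x_new, g_new
    return x

A = [[4.0, 1.0], [1.0, 3.0]]   # symmetric positive definite
b = [1.0, 2.0]
x = bb_gradient(A, b, [0.0, 0.0])
```

On a quadratic, y = As, so the BB stepsize is a Rayleigh-quotient approximation of the inverse curvature; the iteration is nonmonotone but converges for strictly convex quadratics.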