CONJUGATE GRADIENT WITH SUBSPACE OPTIMIZATION

In this paper we present a variant of the conjugate gradient (CG) algorithm in which we invoke a subspace minimization subproblem on each iteration. We call this algorithm CGSO for “conjugate gradient with subspace optimization”. It is related to earlier work by Nemirovsky and Yudin. We apply the algorithm to solve unconstrained strictly convex problems. As with other CG algorithms, the update step on each iteration is a linear combination of the last gradient and the last update. Unlike some other conjugate gradient methods, our algorithm attains a theoretical complexity bound of $O\!\left(\sqrt{\frac{L}{l}}\log\left(\frac{1}{\epsilon}\right)\right)$, where the ratio $\frac{L}{l}$ characterizes the strong convexity of the objective function. In practice, CGSO competes with other CG-type algorithms by incorporating some second-order information in each iteration.
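To illustrate the kind of iteration the abstract describes, the following is a minimal sketch, not the authors' CGSO, of a CG-type method whose step is obtained by exactly minimizing over the two-dimensional subspace spanned by the current gradient and the last update. It assumes a strictly convex quadratic objective $f(x) = \frac{1}{2}x^{T}Ax - b^{T}x$, for which this subspace step has a closed form; the function name `cg_subspace_opt` and all parameters are illustrative only, and a general strictly convex objective would require an inner subspace solver instead of the 2x2 linear system.

```python
# Sketch of a CG-type iteration with an exact 2-D subspace minimization per step,
# specialized to the strictly convex quadratic f(x) = 0.5*x^T A x - b^T x.
# This is an illustrative assumption, not the paper's CGSO algorithm.
import numpy as np

def cg_subspace_opt(A, b, x0, tol=1e-10, max_iter=1000):
    """Minimize 0.5*x^T A x - b^T x with A symmetric positive definite."""
    x = x0.astype(float).copy()
    g = A @ x - b                       # gradient of the quadratic
    p = np.zeros_like(x)                # last update (zero on the first iteration)
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        # Subspace spanned by the current gradient and the last update.
        D = np.column_stack([g, p]) if np.linalg.norm(p) > 0 else g[:, None]
        # Exact minimization of f over x + span(D): solve (D^T A D) t = -D^T g.
        H = D.T @ A @ D
        t = np.linalg.solve(H, -D.T @ g)
        p = D @ t                       # new update: linear combination of g and old p
        x = x + p
        g = A @ x - b
    return x

# Usage on a small, well-conditioned random quadratic.
rng = np.random.default_rng(0)
M = rng.standard_normal((50, 50))
A = M.T @ M + 50 * np.eye(50)           # symmetric positive definite
b = rng.standard_normal(50)
x = cg_subspace_opt(A, b, np.zeros(50))
print(np.linalg.norm(A @ x - b))        # residual should be near the tolerance
```

In the quadratic case this subspace step reproduces linear CG; the point of the sketch is only to show an update that is a linear combination of the last gradient and last update, chosen by a per-iteration subspace minimization.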
