A Limited Memory Subspace Minimization Conjugate Gradient Algorithm for Unconstrained Optimization

Subspace minimization conjugate gradient (SMCG) methods are a class of iterative methods with high potential for unconstrained optimization. Orthogonality of the gradients is an important property of the linear conjugate gradient method. It is, however, observed that this orthogonality is often lost in practice, which usually causes slow convergence of the conjugate gradient method. Building on SMCG$\_$BB (J. Optim. Theory Appl. 180(3), 879-906, 2019), we combine the limited memory technique with the subspace minimization conjugate gradient method and present a limited memory subspace minimization conjugate gradient algorithm for unconstrained optimization. The proposed method includes two types of iterations: SMCG iterations and quasi-Newton (QN) iterations. In the SMCG iteration, the search direction is determined by solving a quadratic approximation problem, in which an important parameter is estimated from properties of the objective function at the current iterate. In the QN iteration, a modified quasi-Newton method in the subspace is exploited to improve the orthogonality. Additionally, a modified strategy for choosing the initial stepsize is employed. The global convergence of the proposed method is established under weaker conditions. Numerical results on the CUTEr library indicate that the proposed method yields a great improvement over SMCG$\_$BB, is comparable to the latest limited memory conjugate gradient software package CG$\_$DESCENT (6.8) (SIAM J. Optim. 23(4), 2150-2168, 2013), and is superior to the limited memory BFGS (L-BFGS) method.
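For context, the search direction in an SMCG iteration is obtained from a two-dimensional subproblem whose standard form in the SMCG$\_$BB framework reads roughly as follows (a sketch in our notation; the exact model used in this paper may differ in how the parameter is estimated):
\[
\min_{d \in \Omega_k} \; g_k^{T} d + \tfrac{1}{2}\, d^{T} B_k d,
\qquad
\Omega_k = \mathrm{Span}\{\, g_k,\; s_{k-1} \,\},
\]
where $g_k$ is the gradient at the current iterate $x_k$, $s_{k-1} = x_k - x_{k-1}$, and $B_k$ is a Hessian approximation whose action on $\Omega_k$ is estimated (e.g., via Barzilai-Borwein-type scalars) rather than formed explicitly. Writing $d = u\, g_k + v\, s_{k-1}$ reduces the subproblem to a two-variable quadratic in $(u, v)$ that can be solved in closed form.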
