A Limited Memory Subspace Minimization Conjugate Gradient Algorithm for Unconstrained Optimization

Subspace minimization conjugate gradient (SMCG) methods are a class of iterative methods with high potential for unconstrained optimization. Orthogonality of the gradients is an important property of the linear conjugate gradient method. It is observed, however, that this orthogonality is often lost in practice, which usually causes slow convergence of conjugate gradient methods. … Read more
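The orthogonality property mentioned above can be seen on a small example. The following is a minimal sketch (not the paper's SMCG algorithm) of the linear conjugate gradient method on a random symmetric positive definite system; the residuals, which are the negative gradients of the underlying quadratic, come out mutually orthogonal up to rounding. All problem data and names here are illustrative.

```python
import numpy as np

# Illustrative SPD test problem (not from the paper).
rng = np.random.default_rng(0)
M = rng.standard_normal((6, 6))
A = M @ M.T + 6 * np.eye(6)        # symmetric positive definite
b = rng.standard_normal(6)

x = np.zeros(6)
r = b - A @ x                      # residual = -gradient of 0.5 x'Ax - b'x
p = r.copy()
residuals = [r.copy()]
for _ in range(6):
    Ap = A @ p
    alpha = (r @ r) / (p @ Ap)     # exact minimizer along p
    x += alpha * p
    r_new = r - alpha * Ap
    if np.linalg.norm(r_new) < 1e-12:
        break
    beta = (r_new @ r_new) / (r @ r)
    p = r_new + beta * p
    r = r_new
    residuals.append(r.copy())

# In exact arithmetic the residuals are mutually orthogonal; in floating
# point the cross inner products stay near machine precision here.
max_cross = max(abs(residuals[i] @ residuals[j])
                for i in range(len(residuals)) for j in range(i))
```

On larger or ill-conditioned problems this orthogonality degrades in finite precision, which is the loss the abstract refers to.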

A Limited Memory Steepest Descent Method

The possibilities inherent in steepest descent methods have been considerably amplified by the introduction of the Barzilai-Borwein choice of step-size and related ideas. These methods have proved competitive with conjugate gradient methods on large-scale unconstrained minimization problems. This paper suggests a method which is able to take advantage … Read more
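For context, the Barzilai-Borwein idea replaces a line search with a step-size built from the two most recent iterates. A minimal sketch of steepest descent with the BB1 step alpha_k = (s_k' s_k)/(s_k' y_k), where s_k = x_k - x_{k-1} and y_k = g_k - g_{k-1}, on an illustrative quadratic (not the paper's limited memory method):

```python
import numpy as np

A = np.diag([1.0, 10.0, 100.0])    # ill-conditioned diagonal Hessian
b = np.array([1.0, 1.0, 1.0])

def grad(x):
    return A @ x - b               # gradient of f(x) = 0.5 x'Ax - b'x

x = np.zeros(3)
g = grad(x)
alpha = 1e-3                       # small safeguarded initial step
for _ in range(500):
    x_new = x - alpha * g
    g_new = grad(x_new)
    s, y = x_new - x, g_new - g
    if s @ y > 0:                  # keep the previous step if curvature fails
        alpha = (s @ s) / (s @ y)  # BB1 step-size
    x, g = x_new, g_new
    if np.linalg.norm(g) < 1e-10:
        break
```

Unlike the classical Cauchy step, the BB step is nonmonotone in the objective, yet on convex quadratics the iteration converges and typically far outpaces exact-line-search steepest descent.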

On a class of limited memory preconditioners for large scale linear systems with multiple right-hand sides

This work is concerned with the development and study of a class of limited memory preconditioners for the solution of sequences of linear systems. To this end, we consider linear systems with the same symmetric positive definite matrix and multiple right-hand sides that become available in sequence. We first propose a general class of preconditioners, called Limited … Read more
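As background, one well-known limited memory preconditioner (LMP) construction stores k directions S gathered while solving the first system and sets H = (I - P)(I - P^T) + S (S^T A S)^{-1} S^T with P = S (S^T A S)^{-1} S^T A, so that H A acts as the identity on span(S). The sketch below verifies that property on random illustrative data; it is an assumption-laden illustration, not necessarily the specific class proposed in the paper.

```python
import numpy as np

rng = np.random.default_rng(1)
n, k = 8, 3
M = rng.standard_normal((n, n))
A = M @ M.T + n * np.eye(n)          # SPD matrix shared by all systems
S = rng.standard_normal((n, k))      # stored directions (e.g. from a CG run)

SAS = S.T @ A @ S                    # small k-by-k Gram matrix
P = S @ np.linalg.solve(SAS, S.T @ A)          # A-orthogonal projector
H = (np.eye(n) - P) @ (np.eye(n) - P.T) \
    + S @ np.linalg.solve(SAS, S.T)            # limited memory preconditioner

# H is symmetric positive definite and H @ A fixes span(S):
assert np.allclose(H @ (A @ S), S)
```

In practice H is applied in factored form without ever forming n-by-n matrices, and the preconditioned CG for later right-hand sides then converges as if the k captured eigendirections had been deflated.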

New Variable Metric Methods for Unconstrained Minimization Covering the Large-Scale Case

A new family of numerically efficient variable metric or quasi-Newton methods for unconstrained minimization is given, which offers a simple means of adaptation to large-scale optimization. Global convergence of the methods can be established for convex, sufficiently smooth functions. Some encouraging numerical experience is reported. Citation: Report V876, Institute of Computer Science, AV CR, Pod Vodarenskou … Read more
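For orientation, a generic variable metric iteration maintains an inverse Hessian approximation H and searches along -H g. The sketch below uses the standard BFGS update as a stand-in; it is an illustration of the general scheme, not the paper's new family, and the quadratic test problem is invented for the example.

```python
import numpy as np

def bfgs_update(H, s, y):
    # Standard BFGS update of the inverse Hessian approximation:
    # H+ = (I - rho s y') H (I - rho y s') + rho s s',  rho = 1/(y's).
    rho = 1.0 / (y @ s)
    I = np.eye(len(s))
    V = I - rho * np.outer(s, y)
    return V @ H @ V.T + rho * np.outer(s, s)

A = np.array([[1.2, 0.3], [0.3, 0.9]])   # mildly conditioned SPD Hessian
b = np.array([1.0, -1.0])

def f_grad(x):
    return A @ x - b                     # gradient of 0.5 x'Ax - b'x

x = np.zeros(2)
H = np.eye(2)                            # initial inverse Hessian guess
g = f_grad(x)
for _ in range(50):
    p = -H @ g                           # quasi-Newton search direction
    x_new = x + p                        # unit step suffices on this problem
    g_new = f_grad(x_new)
    s, y = x_new - x, g_new - g
    if s @ y > 1e-12:                    # curvature condition keeps H SPD
        H = bfgs_update(H, s, y)
    x, g = x_new, g_new
    if np.linalg.norm(g) < 1e-10:
        break
```

Limited memory variants avoid storing H explicitly by keeping only a few recent (s, y) pairs, which is the kind of adaptation to the large-scale case the abstract alludes to.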