A modification of the limited-memory variable metric BNS method for large-scale unconstrained optimization is proposed, which consists in corrections (derived from the idea of conjugate directions) of the used difference vectors for better satisfaction of previous quasi-Newton conditions. In comparison with , where a similar approach is used, correction vectors from more previous iterations can be applied here. For quadratic objective functions, the improvement of convergence is in some sense the best possible: all stored corrected difference vectors are conjugate, and the quasi-Newton conditions with these vectors are satisfied. Global convergence of the algorithm is established for convex, sufficiently smooth functions. Numerical experiments demonstrate the efficiency of the new method.
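The role of conjugacy can be illustrated on a small sketch (this is only a generic illustration of A-conjugate corrections for a quadratic, not the report's actual correction formulas): for a quadratic f(x) = (1/2) x^T A x - b^T x, the difference vectors s_i satisfy y_i = A s_i, so making the stored s_i mutually A-conjugate allows all quasi-Newton conditions H y_i = s_i to hold simultaneously.

```python
import numpy as np

def a_conjugate(S, A):
    """A-orthogonalize the columns of S by Gram-Schmidt in the A inner product.

    Illustrative only: the BNS corrections in the report are derived
    differently, but the target property (mutual conjugacy of the stored
    difference vectors with respect to the Hessian A) is the same.
    """
    C = S.astype(float).copy()
    for i in range(C.shape[1]):
        for j in range(i):
            cj = C[:, j]
            C[:, i] -= (cj @ A @ C[:, i]) / (cj @ A @ cj) * cj
    return C

rng = np.random.default_rng(0)
M = rng.standard_normal((5, 5))
A = M @ M.T + 5 * np.eye(5)      # symmetric positive definite "Hessian"
S = rng.standard_normal((5, 3))  # three stored difference vectors
C = a_conjugate(S, A)
G = C.T @ A @ C                  # Gram matrix in the A inner product
print(np.allclose(G - np.diag(np.diag(G)), 0))  # off-diagonals vanish: True
```

After the correction, C.T @ A @ C is diagonal, i.e. the corrected difference vectors are conjugate with respect to A, which is the situation the abstract describes for quadratic objectives.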
Technical report No. V 1203. Institute of Computer Science, Academy of Sciences of the Czech Republic. Prague, March 2014.