Simple modifications of the limited-memory BFGS (L-BFGS) method for large-scale unconstrained optimization are considered. They consist of corrections, derived from the idea of conjugate directions, of the stored difference vectors, utilizing information from the preceding iteration. For quadratic objective functions, the improvement of convergence is in a certain sense the best possible, and all stored difference vectors are mutually conjugate for unit stepsizes. Global convergence of the algorithm is established for convex, sufficiently smooth functions. Numerical experiments indicate that the new method often improves on the L-BFGS method significantly.
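As background for the method being modified, the following sketch implements the standard L-BFGS two-loop recursion with a backtracking line search, applied to a convex quadratic. The abstract does not give the correction formula for the difference vectors, so only the baseline L-BFGS scheme is shown; the function names and parameter choices (memory `m`, Armijo constant, tolerances) are illustrative assumptions, not part of the report.

```python
import numpy as np

def two_loop(g, pairs):
    """Standard L-BFGS two-loop recursion: apply the implicit inverse
    Hessian approximation (built from stored (s, y) pairs) to gradient g."""
    q = g.copy()
    alphas = []
    for s, y in reversed(pairs):
        rho = 1.0 / (y @ s)
        a = rho * (s @ q)
        alphas.append(a)
        q = q - a * y
    if pairs:                              # standard initial scaling gamma_k * I
        s, y = pairs[-1]
        q = q * ((s @ y) / (y @ y))
    for (s, y), a in zip(pairs, reversed(alphas)):
        rho = 1.0 / (y @ s)
        b = rho * (y @ q)
        q = q + (a - b) * s
    return q

def lbfgs(fun, grad, x0, m=5, max_iter=100, tol=1e-8):
    """Baseline L-BFGS with Armijo backtracking (illustrative sketch)."""
    x = x0.astype(float)
    pairs = []                             # stored difference vectors (s_i, y_i)
    g = grad(x)
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        d = -two_loop(g, pairs)
        t, fx = 1.0, fun(x)
        while fun(x + t * d) > fx + 1e-4 * t * (g @ d):
            t *= 0.5                       # backtracking until Armijo holds
        x_new = x + t * d
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        if y @ s > 1e-12:                  # keep pair only if curvature condition holds
            pairs.append((s, y))
            if len(pairs) > m:
                pairs.pop(0)               # limited memory: drop oldest pair
        x, g = x_new, g_new
    return x

# Convex quadratic test problem: f(x) = 0.5 x^T A x - b^T x, minimizer A^{-1} b
A = np.diag([1.0, 4.0, 9.0, 16.0])
b = np.ones(4)
x_star = lbfgs(lambda x: 0.5 * x @ A @ x - b @ x,
               lambda x: A @ x - b,
               np.zeros(4))
```

The report's modification would intervene where the `(s, y)` pairs are stored, correcting the difference vectors using information from the preceding iteration so that, on quadratics with unit stepsizes, all stored pairs become mutually conjugate.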
Technical report No. V 1120. Institute of Computer Science, Academy of Sciences of the Czech Republic, Pod Vodárenskou věží 2, 182 07 Prague 8, Czech Republic. September 2011.