To improve the performance of the limited-memory variable metric L-BFGS method for large-scale unconstrained optimization, repeating some BFGS updates was proposed in [1, 2]. However, the extra updates need to be selected carefully, since the repeating process can be time consuming. We show that for the limited-memory variable metric BNS method, matrix updating can be efficiently repeated infinitely many times under some conditions, with only a small increase in the number of arithmetic operations. The limit variable metric matrix can be written as a block BFGS update [22], which can be obtained by solving a low-order Lyapunov matrix equation. The resulting method can be advantageously combined with methods based on vector corrections for conjugacy, see e.g. [21]. Global convergence of the proposed algorithm is established for convex and sufficiently smooth functions. Numerical experiments demonstrate the efficiency of the new method.
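The abstract notes that the limit variable metric matrix can be obtained by solving a low-order Lyapunov matrix equation. As a rough illustration only (not the paper's actual construction, whose matrices come from the BNS update data), the following sketch solves a generic continuous Lyapunov equation A X + X Aᵀ = Q by vectorizing it into an n² × n² linear system, which is inexpensive when the order n is small:

```python
# Illustrative sketch: solve the continuous Lyapunov equation A X + X A^T = Q
# for small n by vectorization, using only plain Python lists. The matrices
# here are generic placeholders, not the ones arising in the BNS method.

def matmul(a, b):
    """Dense matrix product of two lists-of-lists."""
    return [[sum(a[i][k] * b[k][j] for k in range(len(b)))
             for j in range(len(b[0]))] for i in range(len(a))]

def transpose(a):
    return [list(row) for row in zip(*a)]

def kron(a, b):
    """Kronecker product of two square matrices."""
    n, m = len(a), len(b)
    return [[a[i // m][j // m] * b[i % m][j % m]
             for j in range(n * m)] for i in range(n * m)]

def solve_linear(mat, rhs):
    """Gaussian elimination with partial pivoting on an augmented copy."""
    n = len(mat)
    a = [row[:] + [rhs[i]] for i, row in enumerate(mat)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(a[r][col]))
        a[col], a[piv] = a[piv], a[col]
        for r in range(col + 1, n):
            f = a[r][col] / a[col][col]
            for c in range(col, n + 1):
                a[r][c] -= f * a[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (a[r][n] - sum(a[r][c] * x[c] for c in range(r + 1, n))) / a[r][r]
    return x

def solve_lyapunov(a, q):
    """Solve A X + X A^T = Q via (I (x) A + A (x) I) vec(X) = vec(Q).

    Row-major flattening of X and Q is consistent with this system because
    transposing the equation gives the same form for X^T.
    """
    n = len(a)
    eye = [[1.0 if i == j else 0.0 for j in range(n)] for i in range(n)]
    k1, k2 = kron(eye, a), kron(a, eye)
    m = [[k1[i][j] + k2[i][j] for j in range(n * n)] for i in range(n * n)]
    rhs = [q[i][j] for i in range(n) for j in range(n)]  # row-stacked vec(Q)
    x = solve_linear(m, rhs)
    return [x[i * n:(i + 1) * n] for i in range(n)]
```

The vectorized system is solvable whenever no two eigenvalues of A sum to zero; for an n of the order of the limited-memory block size, the O(n⁶) cost of the dense solve is negligible.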
Citation
Technical report No. V 1245, Institute of Computer Science, Czech Academy of Sciences, Prague, March 2018