Properties of the block BFGS update and its application to the limited-memory block BNS method for unconstrained minimization.

A block version of the BFGS variable metric update formula and its modifications are investigated. Although this formula satisfies the quasi-Newton conditions for all of the difference vectors used and, in a certain sense, gives the best improvement of convergence for quadratic objective functions, for general functions it does not guarantee that the corresponding direction vectors are descent directions. To overcome this difficulty while retaining the advantageous properties of the block BFGS update, a block version of the limited-memory variable metric BNS method for large-scale unconstrained optimization is proposed. Global convergence of the algorithm is established for convex, sufficiently smooth functions. Numerical experiments demonstrate the efficiency of the new method.
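For orientation, the block quasi-Newton conditions referred to in the abstract can be written compactly as below. The notation (H_{k+1} for the updated inverse Hessian approximation, s_i and y_i for the point and gradient difference vectors, m for the number of stored pairs) is generic and assumed here, not taken from the report:

% Block quasi-Newton conditions: the updated inverse Hessian approximation
% reproduces the secant relation for all m stored difference vectors at once
% (generic notation, assumed rather than quoted from the report).
\[
  H_{k+1} Y_k = S_k, \qquad
  S_k = [\, s_{k-m+1}, \dots, s_k \,], \qquad
  Y_k = [\, y_{k-m+1}, \dots, y_k \,],
\]
\[
  s_i = x_{i+1} - x_i, \qquad y_i = \nabla f(x_{i+1}) - \nabla f(x_i).
\]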

Citation

Research Report V1244-17, Institute of Computer Science, Czech Academy of Sciences, Prague 2017
