Two limited-memory optimization methods with minimum violation of the previous quasi-Newton equations

Limited-memory variable metric methods based on the well-known BFGS update are widely used for large scale optimization. The block version of the BFGS update, derived by Schnabel (1983), Hu and Storey (1991) and Vlček and Lukšan (2019), satisfies the quasi-Newton equations with all used difference vectors and for quadratic objective functions gives the best improvement … Read more
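
To make the claim about the quasi-Newton equations concrete (standard notation, assumed here rather than quoted from the abstract): with s_i = x_{i+1} - x_i and y_i = g_{i+1} - g_i the stored difference vectors, a block update produces an inverse Hessian approximation H_{k+1} satisfying all m stored conditions simultaneously,

\[ H_{k+1} y_i = s_i, \qquad k-m < i \le k, \]

or compactly H_{k+1} Y_k = S_k with S_k = [s_{k-m+1}, ..., s_k] and Y_k = [y_{k-m+1}, ..., y_k], whereas the classical BFGS update guarantees only the most recent condition H_{k+1} y_k = s_k.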

A limited-memory optimization method using the infinitely many times repeated BNS update and conjugate directions

To improve the performance of the limited-memory variable metric L-BFGS method for large scale unconstrained optimization, repeating some of the BFGS updates was proposed in [1, 2]. However, suitable extra updates need to be selected carefully, since the repeating process can be time-consuming. We show that for the limited-memory variable metric BNS method, matrix … Read more
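
For reference, the update whose repetition is at issue is the standard BFGS recurrence for the inverse Hessian approximation (textbook form, assumed as background rather than taken from the paper):

\[ H_{k+1} = V_k^T H_k V_k + \rho_k s_k s_k^T, \qquad V_k = I - \rho_k y_k s_k^T, \qquad \rho_k = 1 / (y_k^T s_k), \]

where s_k = x_{k+1} - x_k and y_k = g_{k+1} - g_k. Repeating such updates with previously stored pairs (s_i, y_i) restores older quasi-Newton conditions, but each repetition costs additional vector operations, hence the need for careful selection.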

Properties of the block BFGS update and its application to the limited-memory block BNS method for unconstrained minimization

A block version of the BFGS variable metric update formula and its modifications are investigated. Although this formula satisfies the quasi-Newton conditions with all used difference vectors and, for quadratic objective functions, gives the best improvement of convergence in some sense, for general functions it does not … Read more

Handling Nonpositive Curvature in a Limited Memory Steepest Descent Method

We propose a limited memory steepest descent (LMSD) method for solving unconstrained optimization problems. As a steepest descent method, the step computation in each iteration requires only the evaluation of the gradient of the objective function and the calculation of a scalar step size. When employed to solve certain convex problems, our method reduces to … Read more
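
As a rough illustration of how cheap each iteration is, consider the classical Barzilai-Borwein step size, a common choice in limited memory steepest descent methods (a sketch only; the rule actually used in the paper may differ):

\[ x_{k+1} = x_k - \alpha_k g_k, \qquad \alpha_k = \frac{s_{k-1}^T s_{k-1}}{s_{k-1}^T y_{k-1}}, \]

with g_k = \nabla f(x_k), s_{k-1} = x_k - x_{k-1} and y_{k-1} = g_k - g_{k-1}. Note that \alpha_k is undefined or negative whenever the curvature estimate s_{k-1}^T y_{k-1} is nonpositive, which is precisely the situation the title refers to.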

A modified limited-memory BNS method for unconstrained minimization based on the conjugate directions idea

A modification of the limited-memory variable metric BNS method for large scale unconstrained optimization is proposed, which consists in corrections (derived from the idea of conjugate directions) of the used difference vectors for better satisfaction of previous quasi-Newton conditions. In comparison with [16], where a similar approach is used, correction vectors from more previous iterations … Read more

A Low-Memory Approach For Best-State Estimation Of Hidden Markov Models With Model Error

We present a low-memory approach for the best-state estimate (data assimilation) of hidden Markov models where model error is considered. In particular, our findings apply to the 4D-Var framework. The novelty of our approach resides in the fact that the storage needed by our estimation framework, while including model error, is dramatically reduced from … Read more
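
For orientation, best-state estimation with model error is commonly posed as a weak-constraint 4D-Var problem (a generic formulation, assumed here; the paper's exact setting may differ): minimize over the trajectory x_0, ..., x_N

\[ J = \frac{1}{2} \sum_{i=0}^{N} (y_i - H_i(x_i))^T R_i^{-1} (y_i - H_i(x_i)) + \frac{1}{2} \sum_{i=1}^{N} (x_i - M_i(x_{i-1}))^T Q_i^{-1} (x_i - M_i(x_{i-1})), \]

where M_i is the model, H_i the observation operator, and Q_i, R_i the model-error and observation-error covariances. Keeping the entire trajectory x_0, ..., x_N in memory is what normally drives the storage cost that the proposed approach reduces.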

Modifications of the limited-memory BNS method for better satisfaction of previous quasi-Newton conditions

Several modifications of the limited-memory variable metric BNS method for large scale unconstrained optimization are proposed, which consist in corrections (derived from the idea of conjugate directions) of the used difference vectors to improve satisfaction of previous quasi-Newton conditions, utilizing information from previous or subsequent iterations. In the case of quadratic objective functions, conjugacy of all … Read more
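
The conjugacy referred to here has a precise meaning for a quadratic objective f(x) = (1/2) x^T A x - b^T x with symmetric positive definite A (standard definitions, assumed): directions s_i and s_j are conjugate when

\[ s_i^T A s_j = 0, \qquad i \ne j, \]

and since y_j = g_{j+1} - g_j = A s_j for such f, this is equivalent to s_i^T y_j = 0, a condition that corrected difference vectors can be made to satisfy.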

A conjugate directions approach to improve the limited-memory BFGS method

Simple modifications of the limited-memory BFGS method (L-BFGS) for large scale unconstrained optimization are considered, which consist in corrections (derived from the idea of conjugate directions) of the used difference vectors, utilizing information from the preceding iteration. In the case of quadratic objective functions, the improvement of convergence is the best one in some sense and … Read more

Generalizations of the limited-memory BFGS method based on quasi-product form of update

Two families of limited-memory variable metric or quasi-Newton methods for unconstrained minimization based on the quasi-product form of update are derived. For the first family, four variants of how to utilize the Strang recurrences for the Broyden class of variable metric updates are investigated; three of them use the same number of stored vectors as the … Read more
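
For context, the Strang recurrences mentioned above are the two-loop recursion with which L-BFGS applies its implicitly updated inverse Hessian approximation to a vector. A minimal sketch in Python (generic textbook form with names chosen here for illustration; the families derived in the paper modify this basic scheme):

    import numpy as np

    def two_loop_recursion(grad, s_list, y_list, gamma):
        """Apply the L-BFGS inverse Hessian approximation, defined by the
        stored pairs (s_i, y_i) and the scaling gamma of the initial matrix
        H_0 = gamma * I, to the vector grad (Strang two-loop recursion)."""
        q = grad.copy()
        rhos = [1.0 / (y @ s) for s, y in zip(s_list, y_list)]
        alphas = []
        # First loop: run from the newest stored pair to the oldest.
        for s, y, rho in zip(reversed(s_list), reversed(y_list), reversed(rhos)):
            alpha = rho * (s @ q)
            alphas.append(alpha)
            q = q - alpha * y
        # Apply the scaled initial matrix.
        r = gamma * q
        # Second loop: run from the oldest stored pair to the newest.
        for s, y, rho, alpha in zip(s_list, y_list, rhos, reversed(alphas)):
            beta = rho * (y @ r)
            r = r + (alpha - beta) * s
        return r  # the search direction is then -r

The stored lists hold the m most recent difference pairs in the order they were generated, so only O(mn) memory is needed for an n-dimensional problem.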

Transformations enabling to construct limited-memory Broyden class methods

The Broyden class of quasi-Newton updates for inverse Hessian approximation is transformed to the formal BFGS update, which makes it possible to generalize the well-known Nocedal method based on the Strang recurrences to the scaled limited-memory Broyden family, using the same number of stored vectors as for the limited-memory BFGS method. Two variants are given, the … Read more
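
For reference, the Broyden class in question is the one-parameter family of inverse Hessian updates (standard form, assumed as background):

\[ H_{k+1} = H_k - \frac{H_k y_k y_k^T H_k}{y_k^T H_k y_k} + \frac{s_k s_k^T}{s_k^T y_k} + \eta_k \, (y_k^T H_k y_k) \, v_k v_k^T, \qquad v_k = \frac{s_k}{s_k^T y_k} - \frac{H_k y_k}{y_k^T H_k y_k}, \]

where \eta_k = 1 recovers BFGS and \eta_k = 0 recovers DFP. Rewriting a general member of this family as a formal BFGS update is what lets the Strang recurrences, and with them the limited-memory machinery of Nocedal's method, be reused for the whole family.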