Limited-Memory BFGS with Displacement Aggregation

A displacement aggregation strategy is proposed for the curvature pairs stored in a limited-memory BFGS (a.k.a. L-BFGS) method such that the resulting (inverse) Hessian approximations are equal to those that would be derived from a full-memory BFGS method. This means that, if a sufficiently large number of pairs are stored, then an optimization algorithm employing …
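
For context, the "curvature pairs" are the displacements and gradient differences (s_k, y_k) that L-BFGS stores and uses to apply its inverse Hessian approximation. The sketch below shows the standard two-loop recursion over such a stored list of pairs; it is generic L-BFGS machinery under my own naming (`lbfgs_direction`, `pairs`), not the paper's aggregation strategy, which operates on which pairs are kept and how they are combined.

```python
# Minimal sketch: applying the L-BFGS inverse Hessian approximation built
# from stored curvature pairs (s_k, y_k) = (x_{k+1} - x_k, g_{k+1} - g_k).
import numpy as np

def lbfgs_direction(grad, pairs):
    """Return -H @ grad, where H is the L-BFGS inverse Hessian approximation
    defined by the stored (s, y) curvature pairs (most recent last)."""
    q = grad.copy()
    saved = []
    for s, y in reversed(pairs):              # first loop: newest to oldest
        rho = 1.0 / (y @ s)
        alpha = rho * (s @ q)
        q -= alpha * y
        saved.append((alpha, rho, s, y))
    if pairs:                                 # initial scaling gamma = s'y / y'y
        s_last, y_last = pairs[-1]
        q *= (s_last @ y_last) / (y_last @ y_last)
    for alpha, rho, s, y in reversed(saved):  # second loop: oldest to newest
        beta = rho * (y @ q)
        q += (alpha - beta) * s
    return -q                                 # quasi-Newton search direction
```

The aggregation strategy described in the abstract acts on the contents of `pairs` itself, so that the approximation produced by a recursion like this matches what a full-memory BFGS method would generate.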

A Stochastic Quasi-Newton Method for Large-Scale Optimization

The question of how to incorporate curvature information in stochastic approximation methods is challenging. The direct application of classical quasi-Newton updating techniques for deterministic optimization leads to noisy curvature estimates that have harmful effects on the robustness of the iteration. In this paper, we propose a stochastic quasi-Newton method that is efficient, robust …
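
The abstract contrasts differencing noisy stochastic gradients with a more careful way of collecting curvature. One common device in this setting, sketched below under illustrative assumptions (synthetic least-squares data, averaged iterates, a larger subsample for curvature, fixed step size and update interval), is to form y from a subsampled Hessian-vector product applied to the displacement s, so the pair (s, y) is less corrupted by gradient noise. This is a generic sketch, not the paper's exact algorithm.

```python
# Hedged sketch: curvature pairs from subsampled Hessian-vector products
# instead of differences of noisy stochastic gradients.
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((1000, 20))          # synthetic least-squares data
b = A @ rng.standard_normal(20) + 0.1 * rng.standard_normal(1000)

def stoch_grad(x, idx):                      # minibatch gradient of 0.5*||Ax-b||^2/m
    Ai, bi = A[idx], b[idx]
    return Ai.T @ (Ai @ x - bi) / len(idx)

def hess_vec(x, v, idx):                     # subsampled Hessian-vector product
    Ai = A[idx]
    return Ai.T @ (Ai @ v) / len(idx)

x, x_prev_avg, avg, pairs = np.zeros(20), None, np.zeros(20), []
for t in range(1, 201):
    idx = rng.choice(1000, size=32, replace=False)
    x -= 0.01 * stoch_grad(x, idx)           # plain SGD step as a placeholder
    avg += x
    if t % 20 == 0:                          # every 20 iterations, add a pair
        x_avg = avg / 20
        if x_prev_avg is not None:
            s = x_avg - x_prev_avg
            big = rng.choice(1000, size=256, replace=False)
            y = hess_vec(x_avg, s, big)      # curvature from a larger subsample
            if s @ y > 1e-10:                # keep only safely positive curvature
                pairs.append((s, y))
        x_prev_avg, avg = x_avg, np.zeros(20)
print(f"collected {len(pairs)} curvature pairs")
```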

A 2-BFGS updating in a trust region framework

We present a new matrix-free method for the trust region subproblem, assuming that the approximate Hessian is updated by the limited memory BFGS formula with m = 2. The resulting updating scheme, called 2-BFGS, gives us the ability to determine via simple formulas the eigenvalues of the resulting approximation. Thus, at each iteration, we can …
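
To make the "eigenvalues via simple formulas" idea concrete, the toy sketch below (my own illustration, not the paper's derivation) uses the fact that with memory m = 2 and initial matrix gamma*I, the BFGS Hessian approximation differs from gamma*I by a matrix of rank at most four. Its spectrum is therefore gamma with multiplicity n - 4 together with the eigenvalues of a small 4 x 4 projected matrix, so no full n x n eigendecomposition is needed.

```python
# Hedged illustration: with m = 2 pairs and B0 = gamma*I, all but at most
# four eigenvalues of the BFGS approximation equal gamma; the rest come
# from a 4 x 4 matrix built on span([S, Y]).
import numpy as np

def bfgs_update(B, s, y):
    """One direct BFGS update of the Hessian approximation B."""
    Bs = B @ s
    return B - np.outer(Bs, Bs) / (s @ Bs) + np.outer(y, y) / (y @ s)

rng = np.random.default_rng(1)
n, gamma = 10, 2.0
S = rng.standard_normal((n, 2))              # displacements s_0, s_1
Y = S + 0.1 * rng.standard_normal((n, 2))    # gradient differences y_0, y_1

B = gamma * np.eye(n)                        # B0 = gamma * I
for i in range(2):                           # apply the m = 2 stored pairs
    B = bfgs_update(B, S[:, i], Y[:, i])

# Full n x n eigendecomposition (what one wants to avoid at scale).
eig_full = np.sort(np.linalg.eigvalsh(B))

# Small 4 x 4 computation on an orthonormal basis of span([S, Y]).
Q, _ = np.linalg.qr(np.column_stack([S, Y]))
eig_small = np.linalg.eigvalsh(Q.T @ B @ Q)
eig_cheap = np.sort(np.concatenate([np.full(n - 4, gamma), eig_small]))

print(np.allclose(eig_full, eig_cheap))      # True: the two spectra agree
```

Knowing the spectrum of the approximation is what lets a trust region method handle the subproblem cheaply at each iteration, which is the setting the abstract describes.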