Preconditioned Barzilai-Borwein Methods for Multiobjective Optimization Problems

Preconditioning is a powerful approach for solving ill-conditioned problems in optimization, where a preconditioning matrix is used to reduce the condition number and speed up the convergence of first-order methods. Unfortunately, in multiobjective optimization it is impossible to capture the curvature of all objective functions with a single preconditioning matrix. Instead, second-order methods for multiobjective … Read more
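
As background for this listing, the following is a minimal sketch of the classical single-objective Barzilai-Borwein step size; the preconditioned multiobjective variant described in the paper is more involved and is not reproduced here. The toy quadratic problem and all names are illustrative assumptions.

```python
import numpy as np

def bb_step(x_prev, x_curr, g_prev, g_curr, long_variant=True):
    """Classical (single-objective) Barzilai-Borwein step size.

    Only illustrates the basic BB idea: fit a scalar multiple of the
    identity to the secant pair (s, y). Not the paper's method.
    """
    s = x_curr - x_prev          # iterate difference
    y = g_curr - g_prev          # gradient difference
    if long_variant:
        return float(s @ s) / float(s @ y)   # "long" BB step
    return float(s @ y) / float(y @ y)       # "short" BB step

# Gradient descent with BB steps on a toy ill-conditioned quadratic.
A = np.diag([1.0, 10.0, 100.0])              # hypothetical Hessian
grad = lambda x: A @ x
x_prev = np.array([1.0, 1.0, 1.0])
x = x_prev - 0.01 * grad(x_prev)             # one small initial step
for _ in range(30):
    alpha = bb_step(x_prev, x, grad(x_prev), grad(x))
    x_prev, x = x, x - alpha * grad(x)
print(np.linalg.norm(x))                     # close to the minimizer at 0
```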

Global convergence of a BFGS-type algorithm for nonconvex multiobjective optimization problems

We propose a modified BFGS algorithm for multiobjective optimization problems that converges globally even in the absence of convexity assumptions on the objective functions. Furthermore, we establish the superlinear convergence of the method under usual conditions. Our approach employs Wolfe step sizes and ensures that the Hessian approximations are updated and corrected at each iteration … Read more
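
For context, here is a minimal sketch of the standard BFGS inverse-Hessian update with a simple curvature-based skip; the paper's specific correction of the Hessian approximations for nonconvex multiobjective problems is not reproduced, and the skip threshold below is an assumption.

```python
import numpy as np

def bfgs_inverse_update(H, s, y, eps=1e-10):
    """Standard BFGS update of an inverse-Hessian approximation H.

    s = x_new - x_old, y = grad_new - grad_old. The update is skipped
    when the curvature s'y is too small, a common (but not the paper's)
    safeguard on nonconvex problems.
    """
    sy = float(s @ y)
    if sy <= eps * np.linalg.norm(s) * np.linalg.norm(y):
        return H                               # poor curvature: keep H unchanged
    rho = 1.0 / sy
    I = np.eye(len(s))
    V = I - rho * np.outer(s, y)
    return V @ H @ V.T + rho * np.outer(s, s)  # (I - rho*s*y')H(I - rho*y*s') + rho*s*s'
```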

A structured quasi-Newton algorithm for optimizing with incomplete Hessian information

We present a structured quasi-Newton algorithm for unconstrained optimization problems in which some second-order derivatives or Hessian terms are unavailable. We provide a formal derivation of the well-known BFGS secant update formula that approximates only the missing Hessian terms, and we propose a line-search quasi-Newton algorithm based on a modification of Wolfe conditions that converges to first-order … Read more
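
The structured-secant idea behind such methods can be sketched as follows, under the usual splitting assumption that the true Hessian is a known part plus a missing part; the function names and the splitting are illustrative and not the paper's notation.

```python
import numpy as np

def structured_secant_pair(x_new, x_old, grad_new, grad_old, known_hess_new):
    """Build the (s, y_hat) pair for a structured quasi-Newton update.

    If B = K(x) + A, where K(x) is the available Hessian part and A
    approximates the missing terms, then A is asked to satisfy the
    modified secant condition A @ s = y - K(x_new) @ s, so only the
    unknown curvature has to be learned.
    """
    s = x_new - x_old
    y = grad_new - grad_old
    y_hat = y - known_hess_new @ s   # subtract the curvature we already know
    return s, y_hat
```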

Block BFGS Methods

We introduce a quasi-Newton method with block updates called Block BFGS. We show that this method, performed with inexact Armijo-Wolfe line searches, converges globally and superlinearly under the same convexity assumptions as BFGS. We also show that Block BFGS is globally convergent to a stationary point when applied to non-convex functions with bounded Hessian, and … Read more
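
For orientation, one common form of a block quasi-Newton update is sketched below, assuming the step block S and gradient-difference block Y are such that S.T @ Y is symmetric positive definite; the safeguards and the precise variant analyzed in the paper may differ.

```python
import numpy as np

def block_bfgs_update(B, S, Y):
    """Block quasi-Newton update of a Hessian approximation B.

    S and Y are n-by-q matrices whose columns are steps and gradient
    differences. The update enforces the block secant equation
    B_new @ S = Y, provided S.T @ Y is symmetric positive definite.
    """
    BS = B @ S
    old_curv = BS @ np.linalg.solve(S.T @ BS, BS.T)  # remove old curvature along span(S)
    new_curv = Y @ np.linalg.solve(S.T @ Y, Y.T)     # insert curvature from the secant pairs
    return B - old_curv + new_curv
```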

Nonsmooth Optimization via BFGS

We investigate the BFGS algorithm with an inexact line search when applied to nonsmooth functions, not necessarily convex. We define a suitable line search and show that it generates a sequence of nested intervals containing points satisfying the Armijo and weak Wolfe conditions, assuming only absolute continuity. We also prove that the line search terminates … Read more
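
A minimal bracketing-and-bisection sketch of a line search enforcing the Armijo and weak Wolfe conditions, in the spirit of the nested-interval construction mentioned above; all names, constants, and the callable interface are placeholders.

```python
import numpy as np

def weak_wolfe_line_search(f, dirderiv, x, d, c1=1e-4, c2=0.9, max_iter=50):
    """Find a step t satisfying the Armijo and weak Wolfe conditions along d.

    f(x) returns the objective value; dirderiv(x) returns the directional
    derivative of f at x along d (negative at t = 0 for a descent direction).
    """
    f0, g0 = f(x), dirderiv(x)
    lo, hi, t = 0.0, np.inf, 1.0
    for _ in range(max_iter):
        if f(x + t * d) > f0 + c1 * t * g0:    # Armijo fails: step too long
            hi = t
        elif dirderiv(x + t * d) < c2 * g0:    # weak Wolfe fails: step too short
            lo = t
        else:
            return t                           # both conditions hold
        t = 2.0 * lo if np.isinf(hi) else 0.5 * (lo + hi)
    return t
```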

Behavior of BFGS with an Exact Line Search on Nonsmooth Examples

We investigate the behavior of the BFGS algorithm with an exact line search on nonsmooth functions. We show that it may fail on a simple polyhedral example, but that it apparently always succeeds on the Euclidean norm function, spiraling into the origin with a Q-linear rate of convergence; we prove this in the case of … Read more

Duality in quasi-Newton methods and new variational characterizations of the DFP and BFGS updates

It is known that quasi-Newton updates can be characterized by variational means, sometimes in more than one way. This paper has two main goals. We first formulate variational problems appearing in quasi-Newton methods within the space of symmetric matrices. This simplifies both their formulations and their subsequent solutions. We then construct, for the first time, … Read more
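
For reference, the classical weighted-Frobenius-norm characterizations of the DFP and BFGS updates are sketched below in coordinates; the paper's coordinate-free formulation on the space of symmetric matrices and its new dual characterizations are not reproduced here.

```latex
% Sketch of the classical variational characterizations, with
% s_k = x_{k+1} - x_k and y_k = \nabla f(x_{k+1}) - \nabla f(x_k).
\begin{align*}
  \text{DFP:}\quad  B_{k+1} &= \arg\min_{B}\ \bigl\| W^{1/2}(B - B_k)\,W^{1/2} \bigr\|_F
      \quad \text{s.t.}\ B = B^{\top},\ B s_k = y_k, \\
  \text{BFGS:}\quad H_{k+1} &= \arg\min_{H}\ \bigl\| \widetilde{W}^{1/2}(H - H_k)\,\widetilde{W}^{1/2} \bigr\|_F
      \quad \text{s.t.}\ H = H^{\top},\ H y_k = s_k,
\end{align*}
% where the weights satisfy W y_k = s_k and \widetilde{W} s_k = y_k,
% so each update is the matrix closest to its predecessor that obeys
% symmetry and the secant equation.
```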

Variational Problems in Quasi-Newton Methods

It has been known since the early 1970s that the Hessian matrices in quasi-Newton methods can be updated by variational means, in several different ways. The usual formulation of these variational problems uses a coordinate system, and the symmetry of the Hessian matrices is enforced through explicit constraints. As a result, the variational problems seem … Read more

iNEOS: An Interactive Environment for Nonlinear Optimization

In this paper we describe iNEOS, an Internet-based environment that facilitates the solution of complex nonlinear optimization problems. It enables a user to easily invoke a remote optimization code without having to supply the model to be optimized. An interactive communication between client and server is established and maintained using CORBA. We test the system … Read more