Extended Barzilai-Borwein method for unconstrained minimization problems

In 1988, Barzilai and Borwein presented a new choice of step size for the gradient method for solving unconstrained minimization problems, aiming to accelerate the convergence of the steepest descent method. The Barzilai-Borwein method requires only a few storage locations and inexpensive computations. Consequently, several authors have paid attention to the Barzilai-Borwein method and have …
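For concreteness, here is a minimal sketch of a gradient iteration with the first Barzilai-Borwein step size, alpha_k = (s^T s)/(s^T y) with s = x_k - x_{k-1} and y = g_k - g_{k-1}. The gradient callable grad_f, the starting point x0, and the safeguards are illustrative assumptions, not taken from the paper:

    import numpy as np

    def bb_gradient(grad_f, x0, max_iter=500, tol=1e-8):
        # Gradient method with the first Barzilai-Borwein (BB1) step size.
        x = np.asarray(x0, dtype=float)
        g = grad_f(x)
        alpha = 1.0 / max(np.linalg.norm(g), 1.0)  # conservative first step
        for _ in range(max_iter):
            if np.linalg.norm(g) <= tol:
                break
            x_new = x - alpha * g
            g_new = grad_f(x_new)
            s, y = x_new - x, g_new - g
            sy = s @ y
            # BB1 step; fall back to a short safeguarded step if the
            # curvature estimate s^T y is not safely positive.
            alpha = (s @ s) / sy if sy > 1e-12 else 1e-4
            x, g = x_new, g_new
        return x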

Primal interior point method for minimization of generalized minimax functions

In this report, we propose a primal interior-point method for large sparse generalized minimax optimization. After a short introduction, where the problem is stated, we derive the basic equations of the Newton method applied to the KKT conditions and develop the primal interior-point method. Next we describe the basic algorithm and give more details concerning …
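For orientation, consider the plain finite-minimax special case F(x) = \max_{1 \le i \le m} f_i(x) (generalized minimax functions are broader compositions of such terms). A primal interior-point treatment rewrites the problem over the epigraph and applies a logarithmic barrier,

\[
\min_{x,\,z}\ \ z - \mu \sum_{i=1}^{m} \log\bigl(z - f_i(x)\bigr), \qquad \mu \downarrow 0,
\]

and Newton's method is applied to the stationarity conditions of this barrier subproblem, which are a \mu-perturbation of the KKT conditions of \min\{z : f_i(x) \le z,\ i = 1,\dots,m\}.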

Multi-Secant Equations, Approximate Invariant Subspaces and Multigrid Optimization

New approximate secant equations are shown to result from the knowledge of (problem-dependent) invariant subspace information, which in turn suggests improvements in quasi-Newton methods for unconstrained minimization. It is also shown that this type of information may often be extracted from the multigrid structure of discretized infinite-dimensional problems. A new limited-memory BFGS using …
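To fix notation (standard quasi-Newton conventions, hedged as to the paper's exact form): with s_j = x_{j+1} - x_j and y_j = \nabla f(x_{j+1}) - \nabla f(x_j), a classical update enforces one secant equation, while a multi-secant variant imposes it over a block of recent steps,

\[
B_{k+1} s_k = y_k
\qquad\text{versus}\qquad
B_{k+1} S_k \approx Y_k,
\quad
S_k = [\,s_{k-m+1}, \dots, s_k\,],\ \
Y_k = [\,y_{k-m+1}, \dots, y_k\,];
\]

the abstract's observation is that approximate invariant-subspace information supplies further, approximate equations of this kind.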

A Retrospective Trust-Region Method for Unconstrained Optimization

We introduce a new trust-region method for unconstrained optimization where the radius update is computed using the model information at the current iterate rather than at the preceding one. The update is then performed according to how well the current model retrospectively predicts the value of the objective function at the last iterate. Global convergence to …
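In generic trust-region notation (a sketch, not necessarily the paper's exact symbols), the classical ratio measures how well the current model m_k predicted the step just taken, while the retrospective ratio re-measures that step with the newly built model m_{k+1}:

\[
\rho_k = \frac{f(x_k) - f(x_k + s_k)}{m_k(x_k) - m_k(x_k + s_k)},
\qquad
\tilde\rho_{k+1} = \frac{f(x_k) - f(x_{k+1})}{m_{k+1}(x_k) - m_{k+1}(x_{k+1})},
\]

and the radius \Delta_{k+1} is chosen from \tilde\rho_{k+1} instead of \rho_k.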

Adaptive cubic overestimation methods for unconstrained optimization

An Adaptive Cubic Overestimation (ACO) algorithm for unconstrained optimization is proposed, simultaneously generalizing an unpublished method due to Griewank (Technical Report NA/12, 1981, DAMTP, Univ. of Cambridge), an algorithm by Nesterov & Polyak (Math. Programming 108(1), 2006, pp 177-205) and a proposal by Weiser, Deuflhard & Erdmann (Optim. Methods Softw. 22(3), 2007, …
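At each iteration such methods minimize (possibly approximately) a local cubic model of the objective; in generic notation (the paper's symbols may differ),

\[
m_k(s) = f(x_k) + g_k^T s + \tfrac{1}{2}\, s^T B_k s + \tfrac{\sigma_k}{3}\, \|s\|^3,
\]

where g_k is the gradient, B_k a (possibly approximate) Hessian, and the adaptively updated weight \sigma_k > 0 plays a role analogous to the reciprocal of a trust-region radius: increasing \sigma_k shortens the step.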

A 2-BFGS updating in a trust region framework

We present a new matrix-free method for the trust-region subproblem, assuming that the approximate Hessian is updated by the limited-memory BFGS formula with m = 2. The resulting updating scheme, called 2-BFGS, gives us the ability to determine via simple formulas the eigenvalues of the resulting approximation. Thus, at each iteration, we can …
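The eigenvalue computation rests on the compact representation of limited-memory BFGS matrices, sketched here in generic notation with initial matrix B_0 = \gamma I (the paper's formulas may differ in detail):

\[
B_k = \gamma I - \Psi_k W_k^{-1} \Psi_k^T, \qquad
\Psi_k = [\,\gamma S_k \;\; Y_k\,], \quad
W_k = \begin{pmatrix} \gamma S_k^T S_k & L_k \\ L_k^T & -D_k \end{pmatrix},
\]

where L_k and D_k are the strictly lower-triangular and diagonal parts of S_k^T Y_k. With m = 2 the correction has rank at most 4, so B_k has eigenvalue \gamma with multiplicity at least n - 4, and the few remaining eigenvalues come from a small dense eigenproblem, which is what keeps the trust-region subproblem matrix-free.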

Developments of NEWUOA for unconstrained minimization without derivatives

The NEWUOA software is described briefly, with some numerical results that show good efficiency and accuracy in the derivative-free unconstrained minimization of functions of up to 320 variables. Some preliminary work on an extension of NEWUOA that allows simple bounds on the variables is also described. It suggests a variation of a technique in …
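For background, the core technique in NEWUOA is least-change quadratic interpolation: with m interpolation points y_1, \dots, y_m, fewer than a full quadratic determines, the updated model is chosen by

\[
\min_{Q}\ \|\nabla^2 Q - \nabla^2 Q_k\|_F
\quad\text{subject to}\quad Q(y_i) = f(y_i),\ i = 1, \dots, m,
\]

so the model Hessian changes as little as possible in the Frobenius norm while matching the sampled function values; m = 2n + 1 is the usual default.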

A view of algorithms for optimization without derivatives

Let the least value of a function of many variables be required. If its gradient is available, then one can tell whether search directions are downhill, and first order conditions help to identify the solution. It seems in practice, however, that the vast majority of unconstrained calculations do not employ any derivatives. A view of …

The Speed of Shor’s R-Algorithm

Shor’s r-algorithm is an iterative method for unconstrained optimization, designed for minimizing nonsmooth functions, for which considerable success has been reported. Although some limited convergence results are known, nothing seems to be known about the algorithm’s rate of convergence, even in the smooth case. We study how the method behaves on convex quadratics, proving …
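For context, the mechanism behind the r-algorithm can be stated schematically (a generic textbook description, not the authors' development): the method maintains a linear change of variables B_k and, at every iteration, dilates the transformed space along the normalized direction \xi_k obtained from the difference of two successive gradients, using the operator

\[
R_\beta(\xi) = I + (\beta - 1)\,\xi\,\xi^T, \qquad \|\xi\| = 1,\ \ 0 < \beta < 1,
\]

which contracts space by the factor \beta in the direction \xi (\beta = 1/2 is a common default); a gradient step is then taken in the transformed coordinates. The behavior analyzed here is the rate such iterations achieve on convex quadratics.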

On large scale unconstrained optimization problems and higher order methods

Third-order methods will in most cases use fewer iterations than a second-order method to reach the same accuracy. However, the number of arithmetic operations per iteration is higher for third-order methods than for a second-order method. Newton’s method is the most commonly used second-order method and Halley’s method is the most …
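The trade-off is easiest to see in one dimension, where finding a stationary point means solving g(x) = 0 with g = f'. In textbook notation (not necessarily the paper's), Newton's step uses g and g', while Halley's step also uses g'':

\[
x_{k+1} = x_k - \frac{g(x_k)}{g'(x_k)} \left( 1 - \frac{g(x_k)\, g''(x_k)}{2\, g'(x_k)^2} \right)^{-1},
\]

which converges cubically near a simple root, versus quadratically for Newton; in n dimensions the extra g'' term becomes a third-derivative tensor, which is the source of the higher per-iteration arithmetic cost.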