Inexact Restoration method for Derivative-Free Optimization with smooth constraints

A new method is introduced for solving constrained optimization problems in which the derivatives of the constraints are available but the derivatives of the objective function are not. The method is based on the Inexact Restoration framework, by means of which each iteration is divided into two phases. In the first phase one considers only … Read more
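The excerpt stops mid-sentence, but the classical Inexact Restoration template on which such methods build splits iteration $k$ into two phases, sketched below in standard form (not the paper's exact algorithm; $h$ collects the smooth constraints and $r \in (0,1)$ is a fixed contraction factor):

\[
\textbf{Phase 1 (restoration):}\quad \text{find } y^k \text{ such that } \|h(y^k)\| \le r\,\|h(x^k)\|,
\]
\[
\textbf{Phase 2 (optimization):}\quad x^{k+1} \approx \arg\min_x f(x) \quad \text{s.t.}\quad \nabla h(y^k)^{\mathsf T}(x - y^k) = 0,
\]

with the trial point accepted through a merit-function or filter test. Phase 1 uses only constraint derivatives, which is what makes the framework attractive when objective derivatives are unavailable.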

Conjugate gradient methods based on secant conditions that generate descent search directions for unconstrained optimization

Conjugate gradient methods have attracted attention because they can be applied directly to large-scale unconstrained optimization problems. In order to incorporate second-order information about the objective function into conjugate gradient methods, Dai and Liao (2001) proposed a conjugate gradient method based on the secant condition. However, their method does not necessarily generate … Read more
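For context (the excerpt is cut off), the Dai–Liao (2001) method referred to above takes, with $s_k = x_{k+1} - x_k$, $y_k = g_{k+1} - g_k$, and a parameter $t \ge 0$,

\[
d_{k+1} = -g_{k+1} + \beta_k^{\mathrm{DL}} d_k,
\qquad
\beta_k^{\mathrm{DL}} = \frac{g_{k+1}^{\mathsf T}(y_k - t\,s_k)}{d_k^{\mathsf T} y_k},
\]

a direction that need not satisfy the descent condition $g_{k+1}^{\mathsf T} d_{k+1} < 0$; guaranteeing descent is precisely the gap the methods in this paper address.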

A short note on the global convergence of the unmodified PRP method

It is well known that the search direction generated by the standard (unmodified) PRP nonlinear conjugate gradient method is not necessarily a descent direction of the objective function, which complicates its global convergence analysis for general functions. Somewhat surprisingly, however, it is easily proved in this short note that the unmodified PRP method still … Read more
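As a concrete reference point, here is a minimal Python sketch of the unmodified PRP iteration. It is illustrative only: the name prp_cg, the Armijo backtracking, and all constants are choices made here, whereas the note's convergence result concerns the method under suitable line searches.

    import numpy as np

    def prp_cg(f, grad, x0, tol=1e-8, max_iter=500):
        # Sketch of the standard (unmodified) PRP conjugate gradient
        # method with Armijo backtracking; illustrative only.
        x = np.asarray(x0, dtype=float)
        g = grad(x)
        d = -g
        for _ in range(max_iter):
            if np.linalg.norm(g) <= tol:
                break
            # Backtracking line search enforcing sufficient decrease.
            alpha, c1 = 1.0, 1e-4
            while f(x + alpha * d) > f(x) + c1 * alpha * g.dot(d) and alpha > 1e-16:
                alpha *= 0.5
            x_new = x + alpha * d
            g_new = grad(x_new)
            # Unmodified PRP parameter: beta may be negative, and the
            # next direction need not be a descent direction.
            beta = g_new.dot(g_new - g) / g.dot(g)
            d = -g_new + beta * d
            x, g = x_new, g_new
        return x

    # Toy run on the Rosenbrock function.
    f = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
    grad = lambda x: np.array([-2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
                               200 * (x[1] - x[0]**2)])
    print(prp_cg(f, grad, [-1.2, 1.0]))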

Constrained Derivative-Free Optimization on Thin Domains

Many derivative-free methods for constrained problems are not efficient for minimizing functions on “thin” domains. Other algorithms, like those based on Augmented Lagrangians, deal with thin constraints using penalty-like strategies. When the constraints are computationally inexpensive but highly nonlinear, these methods spend many potentially expensive objective function evaluations owing to the difficulty of improving feasibility. … Read more

A Nonlinear Conjugate Gradient Algorithm with An Optimal Property and An Improved Wolfe Line Search

In this paper, we seek the conjugate gradient direction closest to the direction of the scaled memoryless BFGS method and propose a family of conjugate gradient methods for unconstrained optimization. An improved Wolfe line search is also proposed, which can avoid a numerical drawback of the Wolfe line search and guarantee the global convergence of … Read more
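For readers skimming this listing, the classical weak Wolfe conditions that the proposed line search refines require a step size $\alpha_k$ satisfying, for constants $0 < c_1 < c_2 < 1$,

\[
f(x_k + \alpha_k d_k) \le f(x_k) + c_1 \alpha_k\, g_k^{\mathsf T} d_k,
\qquad
g(x_k + \alpha_k d_k)^{\mathsf T} d_k \ge c_2\, g_k^{\mathsf T} d_k.
\]

This is background from the classical literature; the paper's improved variant modifies this test to avoid its numerical drawback.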

A surrogate management framework using rigorous trust-region steps

Surrogate models and heuristics are frequently used in the engineering optimization community as convenient approaches to deal with functions whose evaluations are expensive or noisy, or which lack convexity. These methodologies typically do not guarantee any type of convergence under reasonable assumptions and frequently exhibit slow convergence. In this paper we will show how to … Read more
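The excerpt ends mid-sentence; the rigorous steps referred to are classical trust-region steps. Given a surrogate model $m_k$ of the expensive function $f$, one approximately solves

\[
\min_{s}\; m_k(x_k + s) \quad \text{s.t.}\quad \|s\| \le \Delta_k,
\]

and accepts the step and updates the radius $\Delta_k$ according to the agreement ratio $\rho_k = \bigl(f(x_k) - f(x_k + s_k)\bigr) / \bigl(m_k(x_k) - m_k(x_k + s_k)\bigr)$. This is the standard mechanism; how the surrogate is embedded in it is the paper's subject.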

A Dwindling Filter Line Search Method for Unconstrained Optimization

In this paper, we propose a new dwindling multidimensional filter second-order line search method for solving large-scale unconstrained optimization problems. Usually, the multidimensional filter is constructed with a fixed envelope, which imposes a strict condition on the gradient vectors. A dwindling multidimensional filter technique, which is a modification and improvement of the original multidimensional filter, … Read more
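A rough Python sketch of the acceptance test in a multidimensional gradient filter with a fixed envelope follows. The name acceptable and the constant gamma are illustrative; the paper's contribution is precisely to replace this fixed margin with a dwindling one.

    import numpy as np

    def acceptable(g_trial, filter_entries, gamma=1e-3):
        # Fixed-envelope multidimensional-filter test (sketch): the
        # trial gradient must improve SOME component of |g|, by a
        # margin, relative to EVERY stored filter entry. A dwindling
        # filter would let this margin shrink instead of fixing it.
        a = np.abs(np.asarray(g_trial, dtype=float))
        for g_j in filter_entries:
            b = np.abs(np.asarray(g_j, dtype=float))
            if not np.any(a <= b - gamma * np.linalg.norm(b)):
                return False  # dominated by the entry g_j
        return True

    # Toy usage.
    print(acceptable([0.5, 0.1], [[1.0, 1.0]]))  # True: improves both components
    print(acceptable([1.0, 1.0], [[0.5, 0.1]]))  # False: improves neither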

A Perry Descent Conjugate Gradient Method with Restricted Spectrum

A new nonlinear conjugate gradient method, based on Perry’s idea, is presented. It is shown that its sufficient descent property is independent of any line search and that the eigenvalues of $P_{k+1}^{\top}P_{k+1}$ are bounded above, where $P_{k+1}$ is the iteration matrix of the new method. Thus, the global convergence is proven by the spectral analysis … Read more
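For context, Perry's idea is to write the search direction as $d_{k+1} = -P_{k+1} g_{k+1}$ for an explicit iteration matrix; the classical form from Perry (1978), stated here as background rather than as the paper's restricted-spectrum $P_{k+1}$, is

\[
P_{k+1} = I - \frac{s_k y_k^{\mathsf T}}{s_k^{\mathsf T} y_k} + \frac{s_k s_k^{\mathsf T}}{s_k^{\mathsf T} y_k},
\qquad s_k = x_{k+1} - x_k,\quad y_k = g_{k+1} - g_k,
\]

so bounding the eigenvalues of $P_{k+1}^{\top}P_{k+1}$ controls the length and quality of $d_{k+1}$.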

Solving structured nonlinear least-squares and nonlinear feasibility problems with expensive functions

We present an algorithm for nonlinear least-squares and nonlinear feasibility problems, i.e., for systems of nonlinear equations and nonlinear inequalities that depend on the outcome of expensive functions whose derivatives are assumed to be unavailable. Our algorithm combines derivative-free techniques with filter trust-region methods to keep the number of expensive function evaluations low and … Read more
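The excerpt ends mid-sentence; as background, the structured formulation underlying such methods is the least-squares problem and its trust-region model

\[
\min_x\; \tfrac12 \|F(x)\|^2,
\qquad
m_k(s) = \tfrac12 \|F(x_k) + J_k s\|^2, \quad \|s\| \le \Delta_k,
\]

where, in the derivative-free setting considered here, one would expect the Jacobian $J_k$ to be replaced by a model built from interpolation of the expensive residuals (an assumption about the general approach, not a statement of this paper's exact algorithm).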