An Inexact Restoration Direct Multisearch Filter Approach to Multiobjective Constrained Derivative-free Optimization

Direct Multisearch (DMS) is a well-established class of methods for multiobjective derivative-free optimization, where constraints are addressed by an extreme barrier approach that evaluates only feasible points. In this work, we propose a filter approach, combined with an inexact feasibility restoration step, to address constraints in the DMS framework. The filter approach treats feasibility as an …
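
As a rough illustration of the underlying idea (not the paper's precise rule, and with $F$, $g$ and $h$ introduced here only for exposition), a filter in the multiobjective setting can be kept as a list of pairs $(F(x_j), h(x_j))$, where $F$ is the vector of objectives and $h(x) = \|\max(g(x), 0)\|$ aggregates the constraint violation; a trial point $x^+$ is then acceptable only if no stored pair dominates it componentwise:

\[
\nexists\, j \ \text{such that} \quad F(x_j) \le F(x^+) \ \text{componentwise} \quad \text{and} \quad h(x_j) \le h(x^+).
\]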

A filter sequential adaptive cubic regularisation algorithm for nonlinear constrained optimization

In this paper, we propose a filter sequential adaptive regularisation algorithm using cubics (ARC) for solving nonlinear equality-constrained optimization problems. Similar to sequential quadratic programming methods, an ARC subproblem with linearized constraints is considered to obtain a trial step in each iteration. Composite step methods and reduced Hessian methods are employed to tackle the linearized …
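
In generic form (an assumption for illustration; the paper's exact subproblem and norms may differ), such an ARC trial step solves

\[
\min_{s}\; g_k^{\top} s + \tfrac{1}{2}\, s^{\top} B_k s + \tfrac{\sigma_k}{3}\, \|s\|^3
\qquad \text{subject to} \qquad c(x_k) + A_k s = 0,
\]

where $g_k$ and $B_k$ approximate the gradient and the Hessian of the Lagrangian, $A_k$ is the Jacobian of the equality constraints, and $\sigma_k$ is the adaptive cubic regularisation weight. A composite-step approach then splits $s$ into a normal component that reduces $\|c(x_k) + A_k s\|$ and a tangential component lying in the null space of $A_k$, which is where the reduced Hessian acts.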

Global convergence of a derivative-free inexact restoration filter algorithm for nonlinear programming

In this work we present an algorithm for solving constrained optimization problems that does not make explicit use of the objective function derivatives. The algorithm mixes an inexact restoration framework with filter techniques, where the forbidden regions can be given by the flat or the slanting filter rule. Each iteration is decomposed into two independent phases: …
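
To make the two-phase structure concrete, here is a minimal sketch in Python; restore, optimize_step, f and h are hypothetical placeholders, the flat filter rule with a small margin is used, and the derivative-free machinery and filter-update details of the paper are omitted.

    # Illustrative inexact-restoration filter iteration (simplified sketch).
    def acceptable(f_trial, h_trial, filter_pairs, alpha=1e-4):
        # Flat filter rule: improve sufficiently on f or on h w.r.t. every entry.
        return all(h_trial <= (1 - alpha) * h_j or f_trial <= f_j - alpha * h_j
                   for f_j, h_j in filter_pairs)

    def ir_filter_iteration(x, filter_pairs, f, h, restore, optimize_step):
        z = restore(x)                 # phase 1: reduce infeasibility h, ignoring f
        x_trial = optimize_step(z)     # phase 2: reduce f from the restored point
        if acceptable(f(x_trial), h(x_trial), filter_pairs):
            filter_pairs.append((f(x), h(x)))   # (when to add entries is simplified)
            return x_trial, filter_pairs
        return x, filter_pairs         # reject; a real method would shrink the step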

A Filter Active-Set Trust-Region Method

We develop a new active-set method for nonlinear programming problems that solves a regularized linear program to predict the active set and then fixes the active constraints to solve an equality-constrained quadratic program for fast convergence. Global convergence is promoted through the use of a filter. We show that the regularization parameter fulfills the same …
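
One generic form such an active-set prediction step can take (an assumed illustration in the spirit of SLP-EQP methods, not necessarily the paper's exact subproblem) is a linear model of the objective with a proximal regularization,

\[
\min_{d}\; g_k^{\top} d + \tfrac{1}{2\mu}\, \|d\|_2^2
\qquad \text{subject to} \qquad c(x_k) + A_k d \ge 0,
\]

where the constraints active at the minimizer give the predicted active set; these are then imposed as equalities in a quadratic program over the remaining degrees of freedom to obtain the fast Newton-like step, with the filter deciding whether the resulting trial point is accepted.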

Global convergence of slanting filter methods for nonlinear programming

In this paper we present a general algorithm for nonlinear programming which uses a slanting filter criterion for accepting the new iterates. Independently of how these iterates are computed, we prove that all accumulation points of the sequence generated by the algorithm are feasible. Computing the new iterates by the inexact restoration method, we prove …
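
A common way to state a slanting filter rule (given here for orientation; the exact constants and form vary between papers) is that a trial point $x^+$ is acceptable with respect to a filter entry $(f_j, h_j)$ if, for some fixed $\alpha \in (0,1)$,

\[
h(x^+) \le (1-\alpha)\, h_j
\qquad \text{or} \qquad
f(x^+) + \alpha\, h(x^+) \le f_j .
\]

The "slanting" refers to the second condition, which measures the required decrease in $f$ against the violation $h(x^+)$ of the trial point rather than against $h_j$, in contrast with the flat rule.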

A Brief History of Filter Methods

We consider the question of global convergence of iterative methods for nonlinear programming problems. Traditionally, penalty functions have been used to enforce global convergence. In this paper we review a recent alternative, so-called filter methods. Instead of combining the objective and constraint violation into a single function, filter methods view nonlinear optimization as a biobjective …
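
As a rough illustration of that biobjective view (a simplified sketch with hypothetical names, not any particular paper's implementation; practical filters also add small margins or envelopes around each entry), a filter can be maintained as a list of mutually non-dominated pairs of objective value f and constraint violation h:

    # A filter stores (f, h) pairs; a new point is acceptable if no pair dominates it.
    class Filter:
        def __init__(self):
            self.pairs = []          # mutually non-dominated (f, h) pairs

        def acceptable(self, f, h):
            # Dominated means some entry is at least as good in both f and h.
            return not any(f_j <= f and h_j <= h for f_j, h_j in self.pairs)

        def add(self, f, h):
            # Insert the new pair and discard entries it now dominates.
            self.pairs = [(f_j, h_j) for f_j, h_j in self.pairs
                          if not (f <= f_j and h <= h_j)]
            self.pairs.append((f, h))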

Global and finite termination of a two-phase augmented Lagrangian filter method for general quadratic programs

We present a two-phase algorithm for solving large-scale quadratic programs (QPs). In the first phase, gradient-projection iterations approximately minimize an augmented Lagrangian function and provide an estimate of the optimal active set. In the second phase, an equality-constrained QP defined by the current inactive variables is approximately minimized in order to generate a second-order search …
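
For a QP written, as an illustrative assumption, in the standard form $\min_x \tfrac{1}{2} x^{\top} H x + c^{\top} x$ subject to $Ax = b$, $x \ge 0$, the first phase minimizes over $x \ge 0$ an augmented Lagrangian of the familiar form

\[
L_A(x; y, \rho) \;=\; \tfrac{1}{2} x^{\top} H x + c^{\top} x \;-\; y^{\top}(Ax - b) \;+\; \tfrac{\rho}{2}\, \|Ax - b\|_2^2,
\]

using gradient projection; the variables sitting at their bounds at the end of this phase supply the active-set estimate, and the second phase works with an equality-constrained QP in the remaining (inactive) variables.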

A filter-trust-region method for unconstrained optimization

A new filter-trust-region algorithm for solving unconstrained nonlinear optimization problems is introduced. Based on the filter technique introduced by Fletcher and Leyffer, it extends an existing technique of Gould, Leyffer and Toint (SIAM J. Optim., to appear 2004) for nonlinear equations and nonlinear least-squares to the fully general unconstrained optimization problem. The new algorithm is …
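
To give the flavour of how a filter is used without constraints (stated as background on the multidimensional-filter idea of the cited work, not as this paper's exact definition): the filter entries become the magnitudes of the components of the first-order residual, $\theta_i(x) = |c_i(x)|$ in the nonlinear-equations case or $\theta_i(x) = |[\nabla f(x)]_i|$ in the unconstrained case, and a trial point is acceptable when its vector $(\theta_1(x^+), \dots, \theta_n(x^+))$ is not dominated componentwise by any stored entry, with the trust region providing the usual globalization safeguard.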