A Sequential Quadratic Programming Algorithm for Nonconvex, Nonsmooth Constrained Optimization

We consider optimization problems with objective and constraint functions that may be nonconvex and nonsmooth. Problems of this type arise in important applications, many having solutions at points of nondifferentiability of the problem functions. We present a line search algorithm for situations when the objective and constraint functions are locally Lipschitz and continuously differentiable on …
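
For orientation, the problem class described above can be written generically (the paper's precise assumptions on where differentiability may fail are part of the truncated text) as
\[
\min_{x \in \mathbb{R}^n} \; f(x) \quad \text{subject to} \quad c_i(x) \le 0, \quad i = 1, \dots, m,
\]
where $f$ and the $c_i$ are locally Lipschitz but possibly nonconvex and nondifferentiable at some points; as noted above, minimizers frequently sit exactly at such points of nondifferentiability.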

A Line Search Exact Penalty Method Using Steering Rules

Line search algorithms for nonlinear programming must include safeguards to enjoy global convergence properties. This paper describes an exact penalization approach that extends the class of problems that can be solved with line search SQP methods. In the new algorithm, the penalty parameter is adjusted at every iteration to ensure sufficient progress in linear feasibility …
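
As a reminder of the setting (notation here is generic rather than taken from the paper), the exact $\ell_1$-penalty function for constraints $c_i(x) = 0$, $i \in \mathcal{E}$, and $c_i(x) \le 0$, $i \in \mathcal{I}$, is
\[
\phi(x;\nu) \;=\; f(x) \;+\; \nu \sum_{i \in \mathcal{E}} |c_i(x)| \;+\; \nu \sum_{i \in \mathcal{I}} \max\{c_i(x),\,0\},
\]
and a steering rule, roughly speaking, increases the penalty parameter $\nu$ whenever the current step achieves too little of the attainable reduction in linearized constraint violation, rather than fixing $\nu$ in advance.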

A second derivative SQP method: local convergence

Gould and Robinson (NAR 08/18, Oxford University Computing Laboratory, 2008) gave global convergence results for a second-derivative SQP method for minimizing the exact $\ell_1$-merit function for a \emph{fixed} value of the penalty parameter. To establish this result, we used the properties of the so-called Cauchy step, which was itself computed from the so-called predictor step. …
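
For context, and in generic notation, the merit function referred to here has the form
\[
M_\sigma(x) \;=\; f(x) \;+\; \sigma\, \|c(x)\|_1
\]
for a fixed penalty parameter $\sigma > 0$ (with the norm applied to the constraint violation when inequalities are present); loosely, the Cauchy step is obtained by a line search on a model of $M_\sigma$ along the direction given by the predictor step.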

A Sequential Quadratic Programming Algorithm with an Additional Equality Constrained Phase

A sequential quadratic programming (SQP) method is presented that aims to overcome some of the drawbacks of contemporary SQP methods. It avoids the difficulties associated with indefinite quadratic programming subproblems by defining the subproblem so that it is always convex. The novel feature of the approach is the addition of an equality constrained phase that promotes fast …
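
A minimal sketch of the kind of convex subproblem meant above, with $B_k$ a positive definite Hessian approximation (the paper's exact construction may differ):
\[
\min_{d} \;\; \nabla f(x_k)^T d + \tfrac12\, d^T B_k d
\quad \text{s.t.} \quad
c_{\mathcal{E}}(x_k) + \nabla c_{\mathcal{E}}(x_k)^T d = 0, \quad
c_{\mathcal{I}}(x_k) + \nabla c_{\mathcal{I}}(x_k)^T d \le 0.
\]
The equality constrained phase then re-solves a model restricted to the constraints identified as (nearly) active by this convex QP, with the aim of promoting the fast local convergence mentioned above.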

A second derivative SQP method: theoretical issues

Sequential quadratic programming (SQP) methods form a class of highly efficient algorithms for solving nonlinearly constrained optimization problems. Although second derivative information may often be calculated, there is little practical theory that justifies exact-Hessian SQP methods. In particular, the resulting quadratic programming (QP) subproblems are often nonconvex, and thus finding their global solutions may be …
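
Concretely, the QP subproblem in question, written in generic SQP notation for an iterate $(x_k, \lambda_k)$, is
\[
\min_{d} \;\; \nabla f(x_k)^T d + \tfrac12\, d^T \nabla^2_{xx}\mathcal{L}(x_k,\lambda_k)\, d
\quad \text{s.t.} \quad
c(x_k) + \nabla c(x_k)^T d \le 0,
\]
with equalities handled analogously; this subproblem is nonconvex whenever the exact Hessian of the Lagrangian is indefinite, which is why only local or approximate QP solutions can realistically be computed.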

A second derivative SQP method with imposed descent

Sequential quadratic programming (SQP) methods form a class of highly efficient algorithms for solving nonlinearly constrained optimization problems. Although second derivative information may often be calculated, there is little practical theory that justifies exact-Hessian SQP methods. In particular, the resulting quadratic programming (QP) subproblems are often nonconvex, and thus finding their global solutions may be …
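
One generic way to impose descent in this setting (stated here as an illustration, not as the paper's precise condition) uses the $\ell_1$-merit function $\phi_\sigma(x) = f(x) + \sigma\,\|c(x)\|_1$: if a step $d_k$ satisfies the linearized equality constraints $c(x_k) + \nabla c(x_k)^T d_k = 0$, its directional derivative is
\[
D\phi_\sigma(x_k; d_k) \;=\; \nabla f(x_k)^T d_k \;-\; \sigma\,\|c(x_k)\|_1,
\]
so requiring this quantity to be sufficiently negative guarantees that a backtracking line search on $\phi_\sigma$ is well defined.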

Two theoretical results for sequential semidefinite programming

We examine the local convergence of a sequential semidefinite programming approach for solving nonlinear programs with nonlinear semidefiniteness constraints. Known convergence results are extended to slightly weaker second order sufficient conditions and the resulting subproblems are shown to have local convexity properties that imply a weak form of self-concordance of the barrier subproblems.
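
For reference, a nonlinear semidefinite program of the kind considered above can be written generically as
\[
\min_{x \in \mathbb{R}^n} \; f(x) \quad \text{s.t.} \quad h(x) = 0, \quad G(x) \succeq 0,
\]
where $G(x)$ is a symmetric matrix-valued function; a sequential semidefinite programming method solves, at each iterate, a subproblem in which $h$ and the matrix constraint $G$ are linearized, in close analogy with the QP subproblems of SQP.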

The Squared Slacks Transformation in Nonlinear Programming

We recall the use of squared slacks to transform inequality constraints into equalities, and several reasons why their introduction may be harmful in many algorithmic frameworks routinely used in nonlinear programming. Numerical examples performed with a sequential quadratic programming method illustrate those reasons. Citation: Cahier du GERAD G-2007-62, Aug. 2007.
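
The transformation itself is simple to state: an inequality $g(x) \ge 0$ is replaced by the equality $g(x) - s^2 = 0$ with a new slack variable $s$. One of the reasons commonly given for its harmfulness can be read off directly (the notation here is generic, not taken from the paper): stationarity of the reformulated Lagrangian with respect to $s$ gives
\[
\lambda\, s \;=\; 0,
\]
which no longer enforces the sign condition on the multiplier $\lambda$ that the KKT conditions of the original inequality require, so stationary points of the reformulation need not correspond to KKT points of the original problem.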

An Inexact SQP Method for Equality Constrained Optimization

We present an algorithm for large-scale equality constrained optimization. The method is based on a characterization of inexact sequential quadratic programming (SQP) steps that can ensure global convergence. Inexact SQP methods are needed for large-scale applications for which the iteration matrix cannot be explicitly formed or factored and the arising linear systems must be solved …
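
In generic notation, for $\min_x f(x)$ subject to $c(x) = 0$, the exact SQP step at $(x_k, \lambda_k)$ solves the KKT system
\[
\begin{bmatrix} H_k & A_k^T \\ A_k & 0 \end{bmatrix}
\begin{bmatrix} d_k \\ \delta_k \end{bmatrix}
\;=\;
-\begin{bmatrix} \nabla f(x_k) + A_k^T \lambda_k \\ c(x_k) \end{bmatrix},
\]
where $A_k$ is the constraint Jacobian and $H_k$ a Hessian (approximation); an inexact SQP method of the kind described above accepts an approximate solution of this system, with the size of the residual controlled so that the resulting step still makes acceptable progress, e.g. descent on a merit function.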

A Brief History of Filter Methods

We consider the question of global convergence of iterative methods for nonlinear programming problems. Traditionally, penalty functions have been used to enforce global convergence. In this paper we review a recent alternative, so-called filter methods. Instead of combining the objective and constraint violation into a single function, filter methods view nonlinear optimization as a biobjective …
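
In its simplest form (notation generic; details differ between filter methods), a trial point $x^+$ is acceptable to the filter if it is not dominated by any stored pair $(f_j, h_j)$ of objective value and constraint violation, i.e. if for every entry $j$ currently in the filter
\[
f(x^+) < f_j \quad \text{or} \quad h(x^+) < h_j,
\]
where $h(x) \ge 0$ measures infeasibility; practical filter methods add small margins (envelopes) to these inequalities to obtain convergence guarantees.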