An Interior-Point Trust-Funnel Algorithm for Nonlinear Optimization

We present an interior-point trust-funnel algorithm for solving large-scale nonlinear optimization problems. The method is based on an approach proposed by Gould and Toint (Math Prog 122(1):155–196, 2010) that focused on solving equality-constrained problems. Our method is similar in that it achieves global convergence guarantees by combining a trust-region methodology with a funnel mechanism, …
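To make the funnel mechanism concrete, here is a minimal sketch of a funnel-style acceptance rule in Python; it is an illustration of the general idea only (the violation measure, constants, and update rule below are illustrative, not those of the paper):

import numpy as np

def violation(c):
    # Infeasibility of the constraints c(x) = 0, measured by the Euclidean norm.
    return np.linalg.norm(c)

def funnel_accepts(v_trial, v_max):
    # A trial point is acceptable only if its infeasibility stays inside the funnel.
    return v_trial <= v_max

def tighten_funnel(v_max, v_trial, kappa=0.5):
    # After an accepted feasibility-improving step, shrink the funnel radius so the
    # iterates are driven toward the feasible set.
    return max(v_trial, kappa * v_max)

v_max = 1.0                          # current funnel radius
v_trial = violation(np.array([0.3]))
if funnel_accepts(v_trial, v_max):
    v_max = tighten_funnel(v_max, v_trial)
print(v_max)                         # 0.5

The point of the funnel is that it serves as the globalization device in place of a filter or a penalty function: infeasibility is never allowed to exceed a bound that only decreases over the course of the run.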

A Regularized SQP Method with Convergence to Second-Order Optimal Points

Regularized and stabilized sequential quadratic programming methods are two classes of sequential quadratic programming (SQP) methods designed to resolve the numerical and theoretical difficulties associated with ill-posed or degenerate nonlinear optimization problems. Recently, a regularized SQP method has been proposed that provides a strong connection between augmented Lagrangian methods and stabilized SQP methods. The method …
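For background, one common form of the conventional augmented Lagrangian for equality constraints c(x) = 0 is

\[
  L_A(x;\, y_e, \mu) \;=\; f(x) \;-\; y_e^{\top} c(x) \;+\; \frac{1}{2\mu}\,\|c(x)\|^2 ,
\]

where y_e is a multiplier estimate and \mu > 0 is a penalty parameter; the connection alluded to above is, roughly, that a stabilized SQP step can be interpreted as approximately minimizing a quadratic model of such a function.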

An Active-Set Quadratic Programming Method Based On Sequential Hot-Starts

A new method for solving sequences of quadratic programs (QPs) is presented. For each new QP in the sequence, the method utilizes hot-starts that employ information computed by an active-set QP solver during the solution of the first QP. This avoids the computation and factorization of the full matrices for all but the first problem …
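The payoff of hot-starting can be illustrated with a small self-contained example (a generic illustration of active-set reuse, not the method or solver of the paper): if the active set identified for one QP is supplied as the working set for the next, similar QP, a single KKT solve may already certify optimality, so no further active-set iterations or refactorizations are needed.

import numpy as np

def kkt_solve(H, g, A, b, W):
    # Solve  min 1/2 x'Hx + g'x  s.t.  A[W] x = b[W], treating the working set W as equalities.
    Aw = A[W]
    n, m = H.shape[0], len(W)
    K = np.block([[H, -Aw.T], [Aw, np.zeros((m, m))]])
    sol = np.linalg.solve(K, np.concatenate([-g, b[W]]))
    return sol[:n], sol[n:]                    # primal point and working-set multipliers

def certifies_optimality(H, g, A, b, W, tol=1e-9):
    # The hot-started working set is optimal if the KKT point it defines is feasible
    # for all constraints A x >= b and its multipliers are nonnegative.
    x, lam = kkt_solve(H, g, A, b, W)
    return bool(np.all(A @ x >= b - tol) and np.all(lam >= -tol)), x

H = np.eye(2)
A = np.array([[1.0, 0.0], [0.0, 1.0]])
b = np.array([1.0, 0.0])
W = [0]                                        # active set carried over from the previous QP
ok, x = certifies_optimality(H, np.array([0.0, -0.5]), A, b, W)
print(ok, x)                                   # True [1.  0.5]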

A Sequential Quadratic Optimization Algorithm with Rapid Infeasibility Detection

We present a sequential quadratic optimization (SQO) algorithm for nonlinear constrained optimization. The method attains all of the strong global and fast local convergence guarantees of classical SQO methods, but has the important additional feature that fast local convergence is guaranteed when the algorithm is employed to solve infeasible instances. A two-phase strategy, carefully constructed …
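In this setting, rapid infeasibility detection is usually formalized via infeasible stationary points (standard background notation; the paper's precise criterion may differ). For constraints c_E(x) = 0 and c_I(x) \le 0, define the violation measure

\[
  v(x) \;=\; \big\| c_{\mathcal E}(x) \big\|_1 \;+\; \big\| \max\{ c_{\mathcal I}(x),\, 0 \} \big\|_1 .
\]

A point \bar x with v(\bar x) > 0 that is first-order stationary for minimizing v is a certificate of local infeasibility, and fast local convergence to such points is what allows an algorithm to report infeasibility quickly instead of iterating indefinitely.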

A filter method with unified step computation for nonlinear optimization

We present a filter linesearch method for solving general nonlinear and nonconvex optimization problems. The method is of the filter variety, but uses a robust (always feasible) subproblem based on an exact penalty function to compute a search direction. This contrasts with traditional filter methods that use a (separate) restoration phase designed to reduce infeasibility until …
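The filter acceptance test itself is standard and easy to state; the sketch below is a generic version with conventional margin parameters, not the specific rule of the paper. Each filter entry is a pair (constraint violation, objective value), and a trial pair is acceptable only if it improves on every stored entry in at least one of the two measures.

def acceptable_to_filter(filter_entries, v_trial, f_trial, gamma=1e-5):
    # Generic filter test: reject the trial pair if some stored entry dominates it.
    for v, f in filter_entries:
        improves_violation = v_trial <= (1.0 - gamma) * v
        improves_objective = f_trial <= f - gamma * v
        if not (improves_violation or improves_objective):
            return False
    return True

filt = [(1.0, 5.0), (0.4, 7.0)]
print(acceptable_to_filter(filt, 0.2, 7.5))    # True: infeasibility is reduced enough
print(acceptable_to_filter(filt, 0.9, 8.0))    # False: dominated by the entry (0.4, 7.0)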

An Inexact Sequential Quadratic Optimization Algorithm for Nonlinear Optimization

We propose a sequential quadratic optimization method for solving nonlinear optimization problems with equality and inequality constraints. The novel feature of the algorithm is that, during each iteration, the primal-dual search direction is allowed to be an inexact solution of a given quadratic optimization subproblem. We present a set of generic, loose conditions that the …
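A common way to pose such conditions, shown here only as a generic illustration in the spirit of inexact Newton methods (not the paper's actual conditions), is to accept an approximate primal-dual solution of the QP subproblem once its subproblem residual is a fixed fraction of the current outer KKT residual:

import numpy as np

def qp_residual(H, g, J, c, d, y_plus):
    # Residual of the optimality conditions of the QP subproblem
    #     min_d  g'd + 1/2 d'H d   subject to   c + J d = 0.
    r_dual = H @ d + g + J.T @ y_plus      # QP stationarity
    r_feas = c + J @ d                     # linearized constraints
    return np.concatenate([r_dual, r_feas])

def accept_inexact_step(H, g, J, c, d, y_plus, kkt_norm, eta=0.1):
    # Accept the inexact direction once its QP residual is at most eta times the
    # KKT residual of the outer nonlinear problem.
    return np.linalg.norm(qp_residual(H, g, J, c, d, y_plus)) <= eta * kkt_norm

H = np.eye(2); g = np.array([1.0, 0.0])
J = np.array([[1.0, 1.0]]); c = np.array([0.5])
d = np.array([-0.7, 0.2]); y_plus = np.array([-0.25])            # an inexact QP solution
print(accept_inexact_step(H, g, J, c, d, y_plus, kkt_norm=1.0))  # True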

A Globally Convergent Stabilized SQP Method

Sequential quadratic programming (SQP) methods are a popular class of methods for nonlinearly constrained optimization. They are particularly effective for solving a sequence of related problems, such as those arising in mixed-integer nonlinear programming and the optimization of functions subject to differential equation constraints. Recently, there has been considerable interest in the formulation of stabilized …
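For reference, the classical stabilized SQP subproblem for equality constraints c(x) = 0 (standard background; the formulation analyzed in the paper may differ in detail) augments the usual SQP subproblem with a proximal term in the multipliers:

\[
  \min_{d,\, y} \;\; \nabla f(x_k)^{\top} d \;+\; \tfrac12\, d^{\top} H_k\, d \;+\; \tfrac{\mu_k}{2}\, \| y \|^2
  \qquad \text{subject to} \qquad c(x_k) + J(x_k)\, d \;-\; \mu_k\,( y - y_k ) \;=\; 0 .
\]

With \mu_k = 0 this reduces to the standard SQP subproblem; with \mu_k > 0 it remains well posed even when the constraint Jacobian J(x_k) loses rank, which is the source of the improved behavior on degenerate problems.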

Solving Mixed-Integer Nonlinear Programs by QP-Diving

We present a new tree-search algorithm for solving mixed-integer nonlinear programs (MINLPs). Rather than relying on computationally expensive nonlinear solves at every node of the branch-and-bound tree, our algorithm solves a quadratic approximation at every node. We show that the resulting algorithm retains global convergence properties for convex MINLPs, and we present numerical results on …
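The structural idea, solving only a cheap relaxation at every node of the tree and diving on fractional variables, can be sketched as follows (an illustration of the general pattern with a user-supplied relaxation solver, not the paper's algorithm):

import math

def dive(solve_relaxation, integer_vars, bounds, incumbent=math.inf, tol=1e-6):
    # solve_relaxation(bounds) returns (value, x) for the node relaxation, or
    # (math.inf, None) if the node is infeasible; bounds maps variable index -> (lo, hi).
    value, x = solve_relaxation(bounds)
    if x is None or value >= incumbent:                 # prune infeasible or dominated nodes
        return incumbent, None
    frac = {i: abs(x[i] - round(x[i])) for i in integer_vars}
    branch_var = max(frac, key=frac.get)
    if frac[branch_var] <= tol:                         # all integer variables are integral
        return value, x
    best_val, best_x = incumbent, None
    for fixed in (math.floor(x[branch_var]), math.ceil(x[branch_var])):
        child = dict(bounds)
        child[branch_var] = (fixed, fixed)              # dive: fix the fractional variable
        val, sol = dive(solve_relaxation, integer_vars, child, best_val, tol)
        if sol is not None and val < best_val:
            best_val, best_x = val, sol
    return best_val, best_x

def toy_relaxation(bounds):
    # Node relaxation of  min (x - 1.3)^2  over the box bounds[0].
    lo, hi = bounds[0]
    x = min(max(1.3, lo), hi)
    return (x - 1.3) ** 2, {0: x}

print(dive(toy_relaxation, [0], {0: (0, 3)}))           # (0.09..., {0: 1})

In the paper's setting the work at each node would be a quadratic approximation of the nonlinear problem rather than a full nonlinear solve, which is what keeps the per-node cost low.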

Regularized Sequential Quadratic Programming

We present the formulation and analysis of a new sequential quadratic programming (SQP) method for general nonlinearly constrained optimization. The method pairs a primal-dual generalized augmented Lagrangian merit function with a flexible line search to obtain a sequence of improving estimates of the solution. This function is a primal-dual variant of the augmented Lagrangian proposed …
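One common form of such a primal-dual augmented Lagrangian for equality constraints c(x) = 0 (shown for background; it may differ in detail from the merit function analyzed in the paper) is

\[
  M_{\nu}(x, y;\, y_e, \mu) \;=\; f(x) \;-\; c(x)^{\top} y_e \;+\; \frac{1}{2\mu}\,\| c(x) \|^2
  \;+\; \frac{\nu}{2\mu}\,\big\| c(x) + \mu\,( y - y_e ) \big\|^2 ,
\]

in which both the primal variables x and the dual variables y are treated as free variables, y_e is a multiplier estimate updated between iterations, and \mu > 0 and \nu \ge 0 are parameters.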

A Sequential Quadratic Programming Algorithm for Nonconvex, Nonsmooth Constrained Optimization

We consider optimization problems with objective and constraint functions that may be nonconvex and nonsmooth. Problems of this type arise in important applications, many having solutions at points of nondifferentiability of the problem functions. We present a line search algorithm for situations when the objective and constraint functions are locally Lipschitz and continuously differentiable on …