A Sequential Algorithm for Solving Nonlinear Optimization Problems with Chance Constraints

An algorithm is presented for solving nonlinear optimization problems with chance constraints, i.e., those in which a constraint involving an uncertain parameter must be satisfied with at least a minimum probability. In particular, the algorithm is designed to solve cardinality-constrained nonlinear optimization problems that arise in sample average approximations of chance-constrained problems, as well as …
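
For orientation, a generic chance constraint and the cardinality constraint that appears in its sample average approximation can be sketched as follows (generic symbols c, xi, alpha, and N; the notation is not taken from the paper):

```latex
% Chance constraint: the uncertain constraint c(x, xi) <= 0 must hold with
% probability at least 1 - alpha.
\mathbb{P}\bigl[\, c(x,\xi) \le 0 \,\bigr] \;\ge\; 1 - \alpha .

% Sample average approximation with scenarios xi_1, ..., xi_N: the constraint
% may be violated on at most floor(alpha * N) scenarios, which is a
% cardinality (counting) constraint.
\bigl|\{\, i \in \{1,\dots,N\} : c(x,\xi_i) > 0 \,\}\bigr| \;\le\; \lfloor \alpha N \rfloor .
```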

A Sparsity Preserving Convexification Procedure for Indefinite Quadratic Programs Arising in Direct Optimal Control

Quadratic programs (QPs) with an indefinite Hessian matrix arise naturally in some direct optimal control methods, e.g., as subproblems in a sequential quadratic programming (SQP) scheme. Typically, the Hessian is approximated by a positive definite matrix to ensure a unique solution; such a procedure is called regularization. We present a novel regularization method tailored …
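
As a point of reference, one generic regularization simply clips the eigenvalues of the Hessian to obtain a positive definite approximation. The sketch below illustrates that baseline only; it is not the sparsity-preserving procedure proposed in the paper (eigenvalue clipping generally destroys sparsity):

```python
# A minimal sketch of a generic Hessian regularization: replace an indefinite
# Hessian with a positive definite approximation by clipping its eigenvalues.
import numpy as np

def clip_eigenvalues(H, eps=1e-6):
    """Return a symmetric positive definite approximation of H."""
    H = 0.5 * (H + H.T)                 # symmetrize
    w, V = np.linalg.eigh(H)            # eigendecomposition of the symmetric part
    w_clipped = np.maximum(w, eps)      # push all eigenvalues above eps > 0
    return (V * w_clipped) @ V.T        # rebuild V diag(w_clipped) V^T

if __name__ == "__main__":
    H = np.array([[2.0, 0.0], [0.0, -1.0]])   # indefinite Hessian
    H_reg = clip_eigenvalues(H)
    print(np.linalg.eigvalsh(H_reg))           # all eigenvalues >= eps
```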

A SQP type method for constrained multiobjective optimization

We propose an SQP-type method for constrained nonlinear multiobjective optimization. The proposed algorithm maintains a list of nondominated points that is improved both in spread along the Pareto front and in optimality by solving single-objective constrained optimization problems. Under appropriate differentiability assumptions, we discuss convergence to locally optimal Pareto points. We provide numerical results for …
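
The nondominated-list bookkeeping mentioned here can be sketched generically as follows (illustrative code only; the paper's update additionally promotes spread along the Pareto front via single-objective subproblem solves):

```python
# A minimal sketch of maintaining a list of nondominated points in objective
# space, assuming all objectives are minimized.
import numpy as np

def dominates(fa, fb):
    """True if objective vector fa dominates fb (<= in all, < in at least one)."""
    return np.all(fa <= fb) and np.any(fa < fb)

def update_nondominated(archive, f_new):
    """Insert f_new unless it is dominated; drop archive points it dominates."""
    if any(dominates(f, f_new) for f in archive):
        return archive                  # new point is dominated: keep archive as is
    return [f for f in archive if not dominates(f_new, f)] + [f_new]

if __name__ == "__main__":
    archive = []
    for f in [np.array([1.0, 3.0]), np.array([2.0, 2.0]), np.array([2.5, 2.5])]:
        archive = update_nondominated(archive, f)
    print(archive)   # third point is dominated by [2, 2], so only the first two remain
```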

On the Performance of SQP Methods for Nonlinear Optimization

This paper concerns some practical issues associated with the formulation of sequential quadratic programming (SQP) methods for large-scale nonlinear optimization. SQP methods find an approximate solution of a sequence of quadratic programming (QP) subproblems in which a quadratic model of the objective function is minimized subject to the linearized constraints. Extensive numerical results are given …
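
For reference, the QP subproblem described here takes the familiar generic form below (generic notation, stated for inequality constraints c(x) >= 0; equality constraints are linearized analogously):

```latex
% Generic SQP subproblem at iterate x_k, with H_k an approximation of the
% Hessian of the Lagrangian and J(x_k) the constraint Jacobian: minimize a
% quadratic model of the objective subject to the linearized constraints.
\min_{d \in \mathbb{R}^n} \;\; \nabla f(x_k)^{T} d + \tfrac{1}{2}\, d^{T} H_k\, d
\quad \text{subject to} \quad c(x_k) + J(x_k)\, d \ge 0 .
```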

A Filter SQP Method: Local Convergence and Numerical Results

The work by Gould, Loh, and Robinson [“A filter method with unified step computation for nonlinear optimization”, SIAM J. Optim., 24 (2014), pp. 175–209] established global convergence of a new filter line search method for finding local first-order solutions to nonlinear and nonconvex constrained optimization problems. A key contribution of that work was that the …
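
For orientation, a common acceptance test in filter methods (in the style of Fletcher and Leyffer) is sketched below; the precise rule and constants used by the cited method may differ:

```python
# A minimal sketch of a filter acceptance test: a trial point with constraint
# violation h and objective value f is acceptable if it is not (approximately)
# dominated by any entry already in the filter.
BETA, GAMMA = 0.99, 1e-5   # illustrative envelope parameters

def acceptable_to_filter(h, f, filter_entries):
    """filter_entries is a list of (h_j, f_j) pairs already in the filter."""
    return all(h <= BETA * h_j or f <= f_j - GAMMA * h_j
               for (h_j, f_j) in filter_entries)

if __name__ == "__main__":
    filt = [(1.0, 5.0), (0.1, 7.0)]
    print(acceptable_to_filter(0.05, 6.0, filt))   # True: not dominated by any entry
```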

A Filter Active-Set Algorithm for Ball/Sphere Constrained Optimization Problem

In this paper, we propose a filter active-set algorithm for the minimization problem over a product of multiple ball/sphere constraints. By making effective use of the special structure of the ball/sphere constraints, a new limited memory BFGS (L-BFGS) scheme is presented. The new L-BFGS implementation takes advantage of the sparse structure of the Jacobian of …
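
For context, the standard (dense) L-BFGS two-loop recursion that such schemes build on is sketched below; the paper's implementation is a specialized variant that exploits the structure of the ball/sphere constraints:

```python
# A minimal sketch of the textbook L-BFGS two-loop recursion for computing the
# search direction -H*grad from stored curvature pairs (s_i, y_i), ordered
# oldest to newest.
import numpy as np

def lbfgs_direction(grad, s_list, y_list):
    """Return -H*grad, with H the implicit L-BFGS inverse Hessian approximation."""
    q = grad.copy()
    alphas = []
    rhos = [1.0 / (y @ s) for s, y in zip(s_list, y_list)]
    for s, y, rho in reversed(list(zip(s_list, y_list, rhos))):   # newest first
        a = rho * (s @ q)
        alphas.append(a)
        q = q - a * y
    # Initial scaling H0 = gamma * I based on the most recent pair.
    gamma = (s_list[-1] @ y_list[-1]) / (y_list[-1] @ y_list[-1]) if s_list else 1.0
    r = gamma * q
    for (s, y, rho), a in zip(zip(s_list, y_list, rhos), reversed(alphas)):  # oldest first
        b = rho * (y @ r)
        r = r + (a - b) * s
    return -r

if __name__ == "__main__":
    g = np.array([1.0, -2.0])
    s_list = [np.array([0.1, 0.0])]
    y_list = [np.array([0.2, 0.1])]
    print(lbfgs_direction(g, s_list, y_list))
```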

Majorization-minimization procedures and convergence of SQP methods for semi-algebraic and tame programs

In view of solving nonsmooth and nonconvex problems involving complex constraints (like standard NLP problems), we study general majorization-minimization procedures produced by families of strongly convex sub-problems. Using techniques from semi-algebraic geometry and variational analysis, in particular the Lojasiewicz inequality, we establish the convergence of sequences generated by this type of scheme to critical points. The …
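
A generic majorization-minimization step with a strongly convex surrogate can be written as follows (an illustration only, not the specific constrained sub-problems studied in the paper):

```latex
% Generic majorization-minimization step: at the current iterate x_k, minimize
% a strongly convex surrogate h_k that lies above the objective f and touches
% it at x_k.
x_{k+1} \in \arg\min_{x}\; h_k(x), \qquad
h_k(x) \ge f(x) \ \ \forall x, \qquad h_k(x_k) = f(x_k).

% A standard surrogate when \nabla f is L-Lipschitz continuous:
h_k(x) = f(x_k) + \nabla f(x_k)^{T} (x - x_k) + \tfrac{L}{2}\, \| x - x_k \|^2 .
```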

A Globally Convergent Stabilized SQP Method: Superlinear Convergence

Regularized and stabilized sequential quadratic programming (SQP) methods are two classes of methods designed to resolve the numerical and theoretical difficulties associated with ill-posed or degenerate nonlinear optimization problems. Recently, a regularized SQP method has been proposed that allows convergence to points satisfying certain second-order KKT conditions (SIAM J. Optim., 23(4):1983–2010, 2013). The method is …
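
For background, one common way to write a stabilized QP subproblem for an equality-constrained problem is sketched below (under the Lagrangian convention L(x, y) = f(x) - y^T c(x); sign conventions vary across papers, and this is not necessarily the subproblem of the cited method):

```latex
% Stabilized QP subproblem for  minimize f(x)  subject to  c(x) = 0,
% given the iterate x_k, multiplier estimate y_k, and stabilization
% parameter mu_k > 0: the step d and new multipliers y are computed jointly.
\min_{d,\; y} \;\; \nabla f(x_k)^{T} d + \tfrac{1}{2}\, d^{T} H_k\, d
  + \tfrac{\mu_k}{2}\, \| y \|^2
\quad \text{subject to} \quad c(x_k) + J(x_k)\, d + \mu_k \,( y - y_k ) = 0 .
```

The proximal term in the multipliers keeps the subproblem well behaved when the constraint Jacobian is (nearly) rank deficient, and the subproblem formally reduces to the ordinary SQP subproblem as mu_k tends to zero.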

An Inequality-Constrained SQP Method for Eigenvalue Optimization

We consider a problem in eigenvalue optimization, in particular finding a local minimizer of the spectral abscissa: the value of a parameter that results in the smallest magnitude of the largest real part of the spectrum of a matrix system. This is an important problem for the stabilization of control systems. Many …
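
For concreteness, the spectral abscissa itself is straightforward to evaluate; the sketch below uses a hypothetical parameterized matrix A(p) purely for illustration:

```python
# A minimal sketch of the quantity being optimized: the spectral abscissa of a
# matrix is the largest real part of its eigenvalues.
import numpy as np

def spectral_abscissa(A):
    """Largest real part of the eigenvalues of A (negative => asymptotically stable)."""
    return np.max(np.linalg.eigvals(A).real)

def A(p):
    # Hypothetical parameter-dependent system matrix, for illustration only.
    return np.array([[-1.0, p], [0.0, -2.0 + p]])

if __name__ == "__main__":
    for p in (0.0, 1.0, 3.0):
        print(p, spectral_abscissa(A(p)))
```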

SQP Methods for Parametric Nonlinear Optimization

Sequential quadratic programming (SQP) methods are known to be efficient for solving a series of related nonlinear optimization problems because of desirable hot and warm start properties: a solution for one problem is a good estimate of the solution of the next. However, standard SQP solvers contain elements to enforce global convergence that can interfere …
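
A minimal illustration of warm starting across a parametric sequence is sketched below, using SciPy's SLSQP solver purely as a readily available SQP-type method (this is not the parametric SQP approach discussed in the paper):

```python
# A minimal sketch of warm starting: solve
#   min (x1 - t)^2 + (x2 - 1)^2  subject to  x1 + x2 >= 1
# for several values of the parameter t, seeding each solve with the previous
# solution so that later solves start close to their optimum.
import numpy as np
from scipy.optimize import minimize

def solve(t, x0):
    obj = lambda x: (x[0] - t) ** 2 + (x[1] - 1.0) ** 2
    cons = [{"type": "ineq", "fun": lambda x: x[0] + x[1] - 1.0}]
    return minimize(obj, x0, method="SLSQP", constraints=cons)

if __name__ == "__main__":
    x = np.zeros(2)                        # cold start for the first problem
    for t in np.linspace(0.0, 2.0, 5):
        res = solve(t, x)
        x = res.x                          # warm start the next problem
        print(f"t={t:.2f}  x={x}  nit={res.nit}")
```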