A Newton-CG based barrier-augmented Lagrangian method for general nonconvex conic optimization

In this paper we consider finding an approximate second-order stationary point (SOSP) of general nonconvex conic optimization that minimizes a twice differentiable function subject to nonlinear equality constraints and a convex conic constraint. In particular, we propose a Newton-conjugate gradient (Newton-CG) based barrier-augmented Lagrangian method for finding an approximate SOSP of this problem. … Read more
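To make the overall scheme concrete, here is a minimal, self-contained sketch of a barrier-augmented Lagrangian loop under simplifying assumptions: the cone is taken to be the nonnegative orthant (handled by a log barrier), there is a single linear equality constraint, and the inner subproblems are solved by a few Newton steps whose linear systems are handled by conjugate gradients. The toy problem, tolerances, and update factors are all illustrative choices, not the paper's algorithm or parameters.

```python
# Illustrative barrier-augmented Lagrangian loop for
#   min (x0-1)^2 + (x1-2)^2  s.t.  x0 + x1 = 1,  x >= 0.
import numpy as np
from scipy.sparse.linalg import cg

def c(x):
    return x[0] + x[1] - 1.0                             # equality constraint c(x) = 0

gc = np.array([1.0, 1.0])                                # its (constant) gradient

def grad(x, lam, rho, mu):
    gf = np.array([2.0 * (x[0] - 1.0), 2.0 * (x[1] - 2.0)])   # gradient of f
    return gf + (lam + rho * c(x)) * gc - mu / x         # AL term + log-barrier term

def hess(x, rho, mu):
    return 2.0 * np.eye(2) + rho * np.outer(gc, gc) + mu * np.diag(1.0 / x**2)

x, lam, rho, mu = np.array([0.4, 0.4]), 0.0, 10.0, 1.0
for outer in range(8):
    for inner in range(20):                              # inexact inner Newton-CG solve
        g = grad(x, lam, rho, mu)
        if np.linalg.norm(g) < mu:                       # loose, mu-dependent tolerance
            break
        d, _ = cg(hess(x, rho, mu), -g)                  # CG on the Newton system
        step = 1.0
        neg = d < 0
        if neg.any():                                    # fraction-to-the-boundary safeguard
            step = min(1.0, 0.99 * float(np.min(-x[neg] / d[neg])))
        x = x + step * d
    lam += rho * c(x)                                    # first-order multiplier update
    mu *= 0.2                                            # shrink the barrier parameter
print("approximate solution:", x, "constraint violation:", c(x))
```

The pattern to note is the two nested loops: an inner Newton-CG solve of the barrier-augmented Lagrangian subproblem, followed by outer updates of the multiplier estimate and the barrier parameter.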

A Newton-CG based barrier method for finding a second-order stationary point of nonconvex conic optimization with complexity guarantees

In this paper we consider finding an approximate second-order stationary point (SOSP) of nonconvex conic optimization that minimizes a twice differentiable function over the intersection of an affine subspace and a convex cone. In particular, we propose a Newton-conjugate gradient (Newton-CG) based barrier method for finding an $(\epsilon,\sqrt{\epsilon})$-SOSP of this problem. Our method is not … Read more
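For orientation, in the unconstrained setting an $(\epsilon,\sqrt{\epsilon})$-SOSP is usually taken to mean a point whose gradient is small and whose Hessian is nearly positive semidefinite; the conic version in the paper is stated through a suitably scaled (barrier) Hessian, so the display below is only the familiar analogue, not the authors' exact definition.

\[
  \|\nabla f(x)\| \le \epsilon,
  \qquad
  \lambda_{\min}\!\big(\nabla^2 f(x)\big) \ge -\sqrt{\epsilon}.
\]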

Steplength Selection in Interior-Point Methods for Quadratic Programming

We present a new strategy for choosing primal and dual steplengths in a primal-dual interior-point algorithm for convex quadratic programming. Current implementations often scale steps equally to avoid increases in dual infeasibility between iterations. We argue that this approach can be too conservative, while a safeguarded, unequally-scaled steplength strategy will often require fewer steps toward … Read more
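The contrast between the two policies can be sketched in a few lines. In the snippet below, `max_step` is a hypothetical helper implementing the usual fraction-to-the-boundary rule; equal scaling takes the minimum of the primal and dual steplengths for both blocks, while unequal scaling lets each block move as far as its own boundary allows (and, as the abstract notes, then needs a safeguard). The names and the 0.995 factor are illustrative, not the paper's.

```python
import numpy as np

def max_step(v, dv, tau=0.995):
    """Largest alpha in (0, 1] keeping v + alpha*dv > 0 (fraction to the boundary)."""
    mask = dv < 0
    if not mask.any():
        return 1.0
    return min(1.0, tau * float(np.min(-v[mask] / dv[mask])))

def steplengths(x, dx, z, dz, equal=True):
    a_p, a_d = max_step(x, dx), max_step(z, dz)
    if equal:                        # common, conservative choice
        a = min(a_p, a_d)
        return a, a
    return a_p, a_d                  # unequal scaling (needs safeguarding in practice)

# toy primal/dual iterates and search directions
x = np.array([1.0, 0.5]); dx = np.array([-0.4, 0.3])
z = np.array([0.2, 2.0]); dz = np.array([0.1, -2.5])
print("equal:  ", steplengths(x, dx, z, dz, equal=True))
print("unequal:", steplengths(x, dx, z, dz, equal=False))
```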

Dynamic updates of the barrier parameter in primal-dual methods for nonlinear programming

We introduce a framework in which updating rules for the barrier parameter in primal-dual interior-point methods become dynamic. The original primal-dual system is augmented to incorporate explicitly an updating function. A Newton step for the augmented system gives a primal-dual Newton step and also a step in the barrier parameter. Based on local information and … Read more
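Schematically, and only as a hedged reading of the abstract: if $F_\mu(x,z)=0$ denotes the usual perturbed KKT (primal-dual) system with barrier parameter $\mu$, the augmentation appends an equation tying $\mu$ to an updating function $\theta$, so that one Newton step on the augmented system yields a step in the primal-dual variables and in $\mu$ simultaneously. The particular form of $\theta$ and of the augmentation used in the paper may differ from this generic shape.

\[
  F_\mu(x,z) \;=\; \begin{pmatrix} \nabla f(x) - A(x)^T z \\ C(x)\,z - \mu e \end{pmatrix} = 0,
  \qquad
  \widehat F(x,z,\mu) \;=\; \begin{pmatrix} F_\mu(x,z) \\ \mu - \theta(x,z) \end{pmatrix} = 0,
\]

where $A(x)$ is the Jacobian of the inequality constraints $c(x)\ge 0$ and $C(x)=\operatorname{diag}(c(x))$; a Newton step on $\widehat F$ produces $(\Delta x, \Delta z, \Delta\mu)$ at once.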

Computation of Minimum Volume Covering Ellipsoids

We present a practical algorithm for computing the minimum volume $n$-dimensional ellipsoid that must contain $m$ given points $a_1, \ldots, a_m \in \mathbb{R}^n$. This convex constrained problem arises in a variety of applied computational settings, particularly in data mining and robust statistics. Its structure makes it particularly amenable to solution by interior-point methods, and it … Read more
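For context, one standard convex formulation of the minimum-volume covering ellipsoid problem (not necessarily the exact parametrization used in the paper) writes the ellipsoid as $E=\{x : \|Ax-b\|_2 \le 1\}$ with $A \succ 0$, so that $\mathrm{vol}(E)$ is proportional to $\det(A)^{-1}$ and the problem becomes

\[
  \min_{A \succ 0,\; b \in \mathbb{R}^n} \; -\log\det A
  \quad \text{s.t.} \quad \|A a_i - b\|_2 \le 1, \qquad i = 1,\dots,m,
\]

a convex program whose log-det objective and smooth constraints are exactly the kind of structure interior-point methods exploit.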

Complexity of Convex Optimization using Geometry-based Measures and a Reference Point

Our concern lies in solving the following convex optimization problem: minimize $cx$ subject to $Ax=b$, $x \in P$, where $P$ is a closed convex set, not necessarily a cone. We bound the complexity of computing an almost-optimal solution of this problem in terms of natural geometry-based measures of the feasible region and the level-set of … Read more

Global and Local Convergence of Line Search Filter Methods for Nonlinear Programming

Line search methods for nonlinear programming using Fletcher and Leyffer’s filter method, which replaces the traditional merit function, are proposed and their global and local convergence properties are analyzed. Previous theoretical work on filter methods has considered trust region algorithms and only the question of global convergence. The presented framework is applied to barrier interior … Read more
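The idea that a filter can replace a merit function is easy to sketch: the filter stores pairs (constraint violation, objective value), and a trial point is acceptable if it sufficiently improves at least one of the two measures relative to every stored pair. The snippet below is a minimal illustration with made-up margin parameters, not the acceptance and switching rules analyzed in the paper.

```python
def acceptable(theta, f, filter_pairs, gamma_theta=1e-5, gamma_f=1e-5):
    """Trial point (theta, f) is acceptable if, versus every filter entry, it
    sufficiently reduces either the constraint violation or the objective."""
    return all(theta <= (1.0 - gamma_theta) * th_j or f <= f_j - gamma_f * th_j
               for th_j, f_j in filter_pairs)

def add_to_filter(theta, f, filter_pairs):
    """Insert (theta, f) and discard entries it dominates."""
    kept = [(th_j, f_j) for th_j, f_j in filter_pairs
            if not (theta <= th_j and f <= f_j)]
    kept.append((theta, f))
    return kept

filt = [(1.0, 5.0), (0.1, 7.0)]
print(acceptable(0.5, 6.0, filt))   # True: improves on each entry in at least one measure
print(acceptable(0.9, 8.0, filt))   # False: dominated (with margin) by the entry (0.1, 7.0)
```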

Feasible Interior Methods Using Slacks for Nonlinear Optimization

A slack-based feasible interior-point method is described that can be derived as a modification of infeasible methods. The modification is minor for most line search methods, but trust region methods require special attention. It is shown how the Cauchy point, which is often computed in trust region methods, must be modified so that the … Read more
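One common device behind slack-based feasible methods, and the spirit of the modification described above, is to keep the iterate strictly feasible by resetting the slacks to the actual constraint values after each accepted step, shortening any trial step that would leave the feasible region. The toy constraints, the halving safeguard, and the reset rule below are illustrative assumptions, not the paper's exact procedure.

```python
import numpy as np

def c(x):                            # inequality constraints c(x) >= 0
    return np.array([1.0 - x[0]**2 - x[1]**2, x[0] + 0.5])

def slack_reset(x_trial, x_old, s_old, alpha=0.5):
    cv = c(x_trial)
    if (cv > 0).all():               # trial point strictly feasible:
        return x_trial, cv           # accept it and reset the slacks to c(x)
    # otherwise shorten the step toward the (strictly feasible) old point and retry
    return slack_reset(x_old + alpha * (x_trial - x_old), x_old, s_old, alpha)

x_old = np.array([0.0, 0.0]); s_old = c(x_old)
x_trial = np.array([0.9, 0.8])       # this full step would leave the feasible region
x_new, s_new = slack_reset(x_trial, x_old, s_old)
print("accepted point:", x_new, "slacks:", s_new)
```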