A Proximal Method for Identifying Active Manifolds

The minimization of an objective function over a constraint set can often be simplified if the “active manifold” of the constraint set can be correctly identified. In this work we present a simple subproblem, which can be used inside any (convergent) optimization algorithm, that will identify the active manifold of a “prox-regular partly smooth” …
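As a rough illustration of how such an identification subproblem can behave (a generic sketch, not the paper's method): for a composite objective $f(x)+\lambda\|x\|_1$, the $\ell_1$ term is partly smooth relative to the manifolds of fixed support, and a single forward-backward proximal step predicts which coordinates are fixed at zero. All names and the test data below are illustrative.

import numpy as np

# Generic sketch (not the paper's algorithm): the proximal map of the l1
# term is soft thresholding, and the zero pattern of one forward-backward
# step predicts the active manifold near a solution.

def prox_l1(v, t):
    """Proximal map of t*||.||_1 (soft thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def predict_active_manifold(grad_f, x, lam, step=1.0):
    """One proximal (forward-backward) step; indices fixed at zero form
    the predicted active manifold."""
    z = prox_l1(x - step * grad_f(x), step * lam)
    return z, np.flatnonzero(z == 0.0)

# Toy usage with f(x) = 0.5*||x - b||^2: the small components of b are zeroed.
b = np.array([2.0, 0.05, -1.5, 0.01])
z, active = predict_active_manifold(lambda x: x - b, np.zeros_like(b), lam=0.5)
print(z, active)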

Cubic regularization of Newton’s method for convex problems with constraints

In this paper we derive the efficiency estimates of the regularized Newton’s method as applied to constrained convex minimization problems and to variational inequalities. We study a one-step Newton’s method and its multistep accelerated version, which converges on smooth convex problems as $O({1 \over k^3})$, where $k$ is the iteration counter. We also derive the …
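To show the basic mechanics, the sketch below computes one cubic-regularized Newton step $d$ minimizing $\langle g,d\rangle + \tfrac12\langle Hd,d\rangle + \tfrac{M}{6}\|d\|^3$, using the optimality condition $(H + \tfrac{M}{2}\|d\|\,I)\,d = -g$ and a bisection on $\|d\|$. This is a minimal illustration of the basic step only, not the paper's accelerated multistep scheme; the constant $M$ and the test problem are arbitrary choices.

import numpy as np

def cubic_newton_step(g, H, M, r_hi=1e6, tol=1e-10):
    # One cubic-regularization step: minimize g'd + 0.5 d'H d + (M/6)||d||^3.
    # The minimizer satisfies (H + 0.5*M*||d|| I) d = -g; bisect on r = ||d||.
    I = np.eye(len(g))
    def trial(r):
        d = np.linalg.solve(H + 0.5 * M * r * I, -g)
        return np.linalg.norm(d), d
    lo, hi = 0.0, r_hi
    d = np.zeros_like(g)
    for _ in range(200):
        r = 0.5 * (lo + hi)
        nrm, d = trial(r)
        if abs(nrm - r) < tol:
            break
        if nrm > r:
            lo = r   # trial radius too small
        else:
            hi = r   # trial radius too large
    return d

# Toy usage on the convex quadratic f(x) = 0.5 x'Ax - b'x (M chosen arbitrarily).
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, -1.0])
x = np.zeros(2)
for _ in range(10):
    x = x + cubic_newton_step(A @ x - b, A, M=1.0)
print(x, np.linalg.solve(A, b))   # iterates approach the minimizer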

A local convergence property of primal-dual methods for nonlinear programming

We prove a new local convergence property of a primal-dual method for solving nonlinear optimization problems. Following a standard interior-point approach, the complementarity conditions of the original primal-dual system are perturbed by a parameter which is driven to zero during the iterations. The sequence of iterates is generated by a linearization of the perturbed …
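To fix ideas, a standard form of such a perturbed primal-dual system, for $\min f(x)$ subject to $c(x)\ge 0$ with multipliers $\lambda$, is (notation here is generic and not necessarily the paper's):
\[
\nabla f(x) - \nabla c(x)^{\top}\lambda = 0, \qquad \Lambda\, c(x) = \mu e, \qquad (c(x),\lambda) > 0,
\]
where $\Lambda=\mathrm{diag}(\lambda)$ and $e=(1,\dots,1)^{\top}$. Each iteration applies a Newton linearization of this system for the current $\mu$, and $\mu$ is driven to zero as the iterations proceed.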

A Note on Sparse SOS and SDP Relaxations for Polynomial Optimization Problems over Symmetric Cones

This short note extends the sparse SOS (sum of squares) and SDP (semidefinite programming) relaxation proposed by Waki, Kim, Kojima and Muramatsu for normal POPs (polynomial optimization problems) to POPs over symmetric cones, and establishes its theoretical convergence based on the recent convergence result by Lasserre on the sparse SOS and SDP relaxation for normal …
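As a small illustration of the kind of sparsity such relaxations exploit (a generic example, not taken from the note): if the polynomial data only couple variables in overlapping blocks, say
\[
f(x) = f_1(x_1,x_2) + f_2(x_2,x_3),
\]
then the sparse relaxation certifies nonnegativity with low-dimensional SOS polynomials $\sigma_1(x_1,x_2)+\sigma_2(x_2,x_3)$ instead of a single SOS polynomial in all of $(x_1,x_2,x_3)$, which keeps the semidefinite blocks small.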

Interior-Point Methods for Nonconvex Nonlinear Programming: Regularization and Warmstarts

In this paper, we investigate the use of an exact primal-dual penalty approach within the framework of an interior-point method for nonconvex nonlinear programming. This approach provides regularization and relaxation, which can aid in solving ill-behaved problems and in warmstarting the algorithm. We present details of our implementation within the LOQO algorithm and provide extensive …
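For orientation, one common way the primal side of such a penalty-based relaxation is written (a generic sketch only; the exact primal-dual formulation of the paper also relaxes the dual variables) is to add elastic variables $\xi\ge 0$ and a penalty term:
\[
\min_{x,\;\xi\ge 0} \; f(x) + c^{\top}\xi \quad \text{subject to} \quad h(x) + \xi \ge 0,
\]
so that every point is feasible for the relaxed problem and the penalty vector $c$ controls how far the relaxation may move, which is the sense in which the approach provides regularization and relaxation.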

On warm starts for interior methods

An appealing feature of interior methods for linear programming is that the number of iterations required to solve a problem tends to be relatively insensitive to the choice of initial point. This feature has the drawback that it is difficult to design interior methods that efficiently utilize information from an optimal solution to a “nearby” …

Dynamic updates of the barrier parameter in primal-dual methods for nonlinear programming

We introduce a framework in which updating rules for the barrier parameter in primal-dual interior-point methods become dynamic. The original primal-dual system is augmented to explicitly incorporate an updating function. A Newton step for the augmented system gives a primal-dual Newton step and also a step in the barrier parameter. Based on local information and …
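Schematically (in generic notation, not necessarily the paper's), if the perturbed optimality conditions are written $F(w;\mu)=0$ in the primal-dual variables $w$ and the updating rule is appended as an extra equation $\mu = \varphi(w,\mu)$, then one Newton step on the augmented system
\[
\begin{pmatrix} F(w;\mu) \\ \mu - \varphi(w,\mu) \end{pmatrix} = 0
\]
produces a primal-dual step $\Delta w$ together with a step $\Delta\mu$ in the barrier parameter, which is how the barrier update becomes dynamic.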

SOS approximation of polynomials nonnegative on a real algebraic set

Let $V\subset R^n$ be a real algebraic set described by finitely many polynomial equations $g_j(x)=0,\ j\in J$, and let $f$ be a real polynomial, nonnegative on $V$. We show that for every $\epsilon>0$, there exist nonnegative scalars $\{\lambda_j\}_{j\in J}$ such that, for all $r$ sufficiently large, $f+\epsilon\theta_r+\sum_{j\in J} \lambda_j g_j^2$ is a sum of squares. Here, …

A Note on Multiobjective Optimization and Complementarity Constraints

We propose a new approach to convex nonlinear multiobjective optimization that captures the geometry of the Pareto set by generating a discrete set of Pareto points optimally. We show that the problem of finding an optimal representation of the Pareto surface can be formulated as a mathematical program with complementarity constraints. The complementarity constraints arise …
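For readers unfamiliar with the problem class, a mathematical program with complementarity constraints has the generic form (the objective and constraint maps below are placeholders, not the paper's particular formulation):
\[
\min_{z}\; F(z) \quad \text{subject to} \quad 0 \le G(z) \;\perp\; H(z) \ge 0,
\]
where $\perp$ abbreviates the complementarity condition $G(z)^{\top}H(z)=0$.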

Variational Two-electron Reduced Density Matrix Theory for Many-electron Atoms and Molecules: Implementation of the Spin- and Symmetry-adapted T2 Condition through First-order Semidefinite Programming

The energy and properties of a many-electron atom or molecule may be directly computed from a variational optimization of a two-electron reduced density matrix (2-RDM) that is constrained to represent many-electron quantum systems. In this paper we implement a variational 2-RDM method with a representability constraint, known as the $T_2$ condition. The optimization of the …
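For context, a semidefinite program in standard primal form (generic notation; the actual 2-RDM variables and representability conditions are as described in the paper) reads
\[
\min_{X}\; \langle C, X\rangle \quad \text{subject to} \quad \langle A_i, X\rangle = b_i,\; i=1,\dots,m, \qquad X \succeq 0,
\]
with the variational energy minimization over the 2-RDM cast in this form and representability conditions such as $T_2$ entering as additional positive semidefinite constraints.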