Max-min separability: incremental approach and application to supervised data classification

A new algorithm is developed for computing a piecewise linear function that separates two finite point sets in $n$-dimensional space, and the algorithm is applied to solve supervised data classification problems. The algorithm computes hyperplanes incrementally, finding as many as are needed to separate the two sets to within a given tolerance. An …
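
As a concrete illustration of the incremental idea (a toy sketch, not the paper's max-min separability algorithm: the least-squares fitting rule, the tolerance handling, and all names below are our own assumptions), one can keep fitting hyperplanes to the points that remain misclassified until the error falls below the tolerance:

    import numpy as np

    def fit_hyperplane(A, B):
        # Least-squares affine separator: label set A as +1, set B as -1.
        X = np.vstack([A, B])
        y = np.hstack([np.ones(len(A)), -np.ones(len(B))])
        Xh = np.hstack([X, np.ones((len(X), 1))])     # affine (bias) column
        w, *_ = np.linalg.lstsq(Xh, y, rcond=None)
        return w

    def incremental_separation(A, B, tol=0.05, max_planes=20):
        planes = []
        for _ in range(max_planes):
            w = fit_hyperplane(A, B)
            planes.append(w)
            Ah = np.hstack([A, np.ones((len(A), 1))])
            wrong = Ah @ w <= 0                       # A-points on the wrong side
            if wrong.mean() <= tol:
                break
            A = A[wrong]                              # refit on the hard points only
        return planes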

A Proximal Method for Identifying Active Manifolds

The minimization of an objective function over a constraint set can often be simplified if the “active manifold” of the constraint set can be correctly identified. In this work we present a simple subproblem, which can be used inside any (convergent) optimization algorithm, that will identify the active manifold of a “prox-regular partly smooth” …
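
The subproblem in question is a proximal step; in the standard notation of this literature (our rendering, not a quotation from the paper), given the current iterate $x$ and a parameter $\lambda > 0$ one solves

\[
  p \in \operatorname*{argmin}_{y \in \mathbb{R}^n} \Big\{ f(y) + \tfrac{1}{2\lambda}\,\|y - x\|^2 \Big\},
\]

and, roughly, for prox-regular partly smooth $f$ the point $p$ lies on the active manifold once $x$ is close enough to the solution.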

Benchmark of Some Nonsmooth Optimization Solvers for Computing Nonconvex Proximal Points

The major focus of this work is to compare, via numerical testing, several methods for computing the proximal point of a nonconvex function. To do this, we introduce two techniques for randomly generating challenging nonconvex test functions, as well as two very specific test functions which should be of future interest for nonconvex optimization benchmarking. …
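
For reference, “computing the proximal point” of a nonconvex function means globally minimizing a quadratically perturbed copy of it. The sketch below does this naively by multi-start local minimization (the paper benchmarks dedicated nonsmooth solvers instead; the double-well test function and all names here are our illustrative assumptions):

    import numpy as np
    from scipy.optimize import minimize

    def prox(f, x, lam=0.5, n_starts=8, seed=0):
        # argmin_y f(y) + ||y - x||^2 / (2 lam), approximated by
        # multi-start local minimization since f may be nonconvex.
        rng = np.random.default_rng(seed)
        x = np.atleast_1d(np.asarray(x, dtype=float))
        obj = lambda y: f(y) + np.sum((y - x) ** 2) / (2 * lam)
        runs = [minimize(obj, x + lam * rng.standard_normal(x.shape))
                for _ in range(n_starts)]
        return min(runs, key=lambda r: r.fun).x

    f = lambda y: np.sum((y ** 2 - 1.0) ** 2)         # double well: minima at +/- 1
    print(prox(f, np.array([0.2])))                   # pulled toward the nearer well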

Iterative Solution of Augmented Systems Arising in Interior Methods

Iterative methods are proposed for certain augmented systems of linear equations that arise in interior methods for general nonlinear optimization. Interior methods define a sequence of KKT equations that represent the symmetrized (but indefinite) equations associated with Newton’s method for a point satisfying the perturbed optimality conditions. These equations involve both the primal and dual …
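
A minimal sketch of the kind of system in question, with MINRES as the (unpreconditioned) iterative solver; the random data below merely stands in for a barrier Hessian and constraint Jacobian at an interior iterate:

    import numpy as np
    from scipy.sparse import bmat, eye, random as sprandom
    from scipy.sparse.linalg import minres

    rng = np.random.default_rng(0)
    n, m = 50, 20
    M = sprandom(n, n, density=0.1, random_state=0)
    H = M + M.T + 10 * eye(n)                 # symmetric "Hessian" block
    A = sprandom(m, n, density=0.2, random_state=1)
    K = bmat([[H, A.T], [A, None]]).tocsr()   # symmetric indefinite KKT matrix
    rhs = rng.standard_normal(n + m)

    sol, info = minres(K, rhs)                # MINRES handles the indefiniteness
    print(info, np.linalg.norm(K @ sol - rhs))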

Computing Proximal Points on Nonconvex Functions

The proximal point mapping is the basis of many optimization techniques for convex functions. By means of variational analysis, the concept of proximal mapping was recently extended to nonconvex functions that are prox-regular and prox-bounded. In such a setting, the proximal point mapping is locally Lipschitz continuous and its set of fixed points coincides with …
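
In symbols (our rendering; the equivalence is the standard local statement for prox-regular, prox-bounded $f$ and sufficiently small $\lambda > 0$):

\[
  P_{\lambda} f(x) = \operatorname*{argmin}_{y} \Big\{ f(y) + \tfrac{1}{2\lambda}\,\|y - x\|^2 \Big\},
  \qquad
  \bar{x} \in P_{\lambda} f(\bar{x}) \iff 0 \in \partial f(\bar{x}),
\]

so searching for a fixed point of $P_{\lambda} f$ amounts to searching for a critical point of $f$.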

Convergent relaxations of polynomial matrix inequalities and static output feedback

Using a moment interpretation of recent results on sum-of-squares decompositions of non-negative polynomial matrices, we propose a hierarchy of convex linear matrix inequality (LMI) relaxations to solve non-convex polynomial matrix inequality (PMI) optimization problems, including bilinear matrix inequality (BMI) problems. This hierarchy of LMI relaxations generates a monotone sequence of lower bounds that converges to …
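
To make the mechanism concrete, here is the first (Shor) level of such a hierarchy on a toy trust-region instance, written with cvxpy (assumed available); the paper's hierarchy tightens the bound by adding higher-order moment and localizing matrices:

    import numpy as np
    import cvxpy as cp

    rng = np.random.default_rng(0)
    n = 4
    Q = rng.standard_normal((n, n)); Q = (Q + Q.T) / 2     # indefinite

    # Relax  min x'Qx  s.t. ||x||^2 <= 1  by lifting X ~ x x'
    # into the moment matrix Y = [[1, x'], [x, X]] >= 0.
    Y = cp.Variable((n + 1, n + 1), symmetric=True)
    X = Y[1:, 1:]
    prob = cp.Problem(cp.Minimize(cp.trace(Q @ X)),
                      [Y >> 0, Y[0, 0] == 1, cp.trace(X) <= 1])
    prob.solve()
    print("LMI lower bound:", prob.value)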

Jordan-algebraic aspects of nonconvex optimization over symmetric cones

We illustrate the usefulness of Jordan-algebraic techniques for nonconvex optimization by considering a potential-reduction algorithm for a nonconvex quadratic function over the domain obtained as the intersection of a symmetric cone with an affine subspace. Citation: Preprint, September 2004.
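
In symbols, the problem class is (our schematic rendering, in the notation customary for symmetric cones):

\[
  \min_{x} \; \tfrac{1}{2}\,x^{\top} Q x + c^{\top} x
  \quad \text{s.t.} \quad Ax = b, \; x \in K,
\]

where $K$ is a symmetric cone and $Q$ is not assumed positive semidefinite, which is what makes the problem nonconvex.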

SIAG/Opt Views-and-News Vol 14 No 1

SIAM’s SIAG/Opt Newsletter special issue on Large Scale Nonconvex Optimization. Guest editors Sven Leyffer and Jorge Nocedal, with contributions by Gould, Sachs, Biegler, Waechter, Leyffer, Bussieck and Pruessner. Citation: SIAG/Opt Views-and-News, Volume 14, Number 1, April 2003. http://fewcal.uvt.nl/sturm/siagopt/

Lagrangian Smoothing Heuristic for Max-Cut

This paper presents smoothing heuristics, based on Lagrangian relaxation, for an NP-hard combinatorial problem. We formulate the Lagrangian dual of this nonconvex quadratic problem and propose nonsmooth unconstrained eigenvalue optimization to solve the dual with bundle or subgradient methods. Heuristics are then derived to obtain good primal solutions through path-following methods using a projected …
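
A minimal sketch of the dual side of this setup, using the classical Poljak-Rendl eigenvalue bound as a stand-in for the paper's Lagrangian dual, with a plain subgradient step where a bundle method would otherwise be used (the random graph and step-size rule are illustrative assumptions):

    import numpy as np

    rng = np.random.default_rng(0)
    n = 12
    W = np.triu(rng.integers(0, 2, (n, n)), 1); W = W + W.T   # random graph
    L = np.diag(W.sum(1)) - W                                 # graph Laplacian

    def phi_and_subgrad(u):
        vals, vecs = np.linalg.eigh(L + np.diag(u))
        v = vecs[:, -1]                            # top eigenvector
        phi = n / 4 * vals[-1] - u.sum() / 4       # dual value: upper bound on max-cut
        g = n / 4 * v ** 2 - 0.25                  # a subgradient of phi at u
        return phi, g

    u = np.zeros(n)
    for k in range(1, 201):
        phi, g = phi_and_subgrad(u)
        u -= (1.0 / k) * g                         # diminishing-step subgradient method
    print("eigenvalue upper bound on the max-cut value:", phi)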

A General Framework for Convex Relaxation of Polynomial Optimization Problems over Cones

The class of POPs (Polynomial Optimization Problems) over cones covers a wide range of optimization problems such as $0$-$1$ integer linear and quadratic programs, nonconvex quadratic programs and bilinear matrix inequalities. This paper presents a new framework for convex relaxation of POPs over cones in terms of linear optimization problems over cones. It provides a …
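
Schematically (our rendering, not the paper's exact notation), a POP over cones is

\[
  \min_{x \in \mathbb{R}^n} \; f_0(x)
  \quad \text{s.t.} \quad \big(f_1(x), \dots, f_m(x)\big) \in K,
\]

where each $f_i$ is a polynomial and $K$ is a closed cone such as the nonnegative orthant, a second-order cone, or a cone of positive semidefinite matrices; $0$-$1$ constraints, for instance, enter as the polynomial equations $x_i^2 - x_i = 0$.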