A Cutting Plane Algorithm for Large Scale Semidefinite Relaxations

The recent spectral bundle method makes it possible to compute, within reasonable time, approximate dual solutions of large scale semidefinite quadratic 0-1 programming relaxations. We show that it also generates a sequence of primal approximations that converges to a primal optimal solution. Separating with respect to these approximations gives rise to a cutting plane algorithm that converges …
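The separate-and-reoptimize loop described here can be sketched as follows; this is an added illustration, not the authors' implementation, and solve_relaxation (a stand-in for a spectral bundle solve returning an approximate primal matrix) and separate (the cutting-plane oracle) are hypothetical.

    def cutting_plane(solve_relaxation, separate, max_rounds=50):
        # solve_relaxation(cuts): hypothetical bundle call returning an
        #   approximate primal matrix for the relaxation strengthened by `cuts`
        # separate(X): hypothetical oracle returning inequalities violated by X
        cuts = []
        X = None
        for _ in range(max_rounds):
            X = solve_relaxation(cuts)
            violated = separate(X)
            if not violated:
                break                      # no violated inequalities found
            cuts.extend(violated)
        return X, cuts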

Strong semismoothness of eigenvalues of symmetric matrices and its application to inverse eigenvalue problems

It is well known that the eigenvalues of a real symmetric matrix are not everywhere differentiable. A classical result of Ky Fan states that each eigenvalue of a symmetric matrix is the difference of two convex functions. This directly implies that the eigenvalues of a symmetric matrix are semismooth everywhere. Based on a very recent …
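For concreteness, the difference-of-convex representation behind this claim (added here as an illustration, not part of the abstract) follows from Ky Fan's maximum principle: ordering the eigenvalues $\lambda_1(A) \ge \cdots \ge \lambda_n(A)$,
\[
\sum_{i=1}^{k}\lambda_i(A) \;=\; \max_{V^\top V = I_k} \operatorname{tr}\bigl(V^\top A V\bigr),
\qquad
\lambda_k(A) \;=\; \sum_{i=1}^{k}\lambda_i(A) \;-\; \sum_{i=1}^{k-1}\lambda_i(A),
\]
and each partial sum is a pointwise maximum of linear functions of $A$, hence convex, so every $\lambda_k$ is a difference of convex functions.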

Polynomial interior point cutting plane methods

Polynomial cutting plane methods based on the logarithmic barrier function and on the volumetric center are surveyed. These algorithms construct a linear programming relaxation of the feasible region, find an appropriate approximate center of the region, and call a separation oracle at this approximate center to determine whether additional constraints should be added to the …
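A minimal sketch of the relax/center/separate loop described above, added for illustration; approximate_center and oracle are hypothetical placeholders, and the surveyed algorithms differ in how the center is computed (logarithmic barrier or volumetric) and in their complexity guarantees.

    def center_cutting_plane(oracle, approximate_center, constraints, rounds=100):
        # constraints: linear inequalities describing the current relaxation
        # approximate_center: hypothetical routine returning an approximate
        #   center of the relaxation
        # oracle(y): returns a violated constraint at y, or None if y is feasible
        constraints = list(constraints)
        y = None
        for _ in range(rounds):
            y = approximate_center(constraints)
            cut = oracle(y)
            if cut is None:
                break                      # test point lies in the feasible region
            constraints.append(cut)        # add the returned cutting plane and recenter
        return y, constraints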

Variational Analysis of Non-Lipschitz Spectral Functions

We consider spectral functions $f \circ \lambda$, where $f$ is any permutation-invariant mapping from $\Cx^n$ to $\Rl$, and $\lambda$ is the eigenvalue map from the set of $n \times n$ complex matrices to $\Cx^n$, ordering the eigenvalues lexicographically. For example, if $f$ is the function “maximum real part” …
Citation: Math. Programming 90 (2001), pp. 317-352
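As an added illustration of the composition, using the abstract's own example of $f$:
\[
f(\mu_1,\dots,\mu_n) \;=\; \max_{1\le i\le n}\operatorname{Re}\mu_i
\quad\Longrightarrow\quad
(f\circ\lambda)(X) \;=\; \alpha(X),
\]
the spectral abscissa of the matrix $X$, which is neither convex nor Lipschitz in general.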

Variational Analysis of the Abscissa Mapping for Polynomials

The abscissa mapping on the affine variety $M_n$ of monic polynomials of degree $n$ is the mapping that takes a monic polynomial to the maximum of the real parts of its roots. This mapping plays a central role in the stability theory of matrices and dynamical systems. It is well known that the abscissa mapping …
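In symbols (added for concreteness, following the abstract's notation):
\[
a(p) \;=\; \max\{\operatorname{Re}\zeta \;:\; p(\zeta) = 0\}, \qquad p \in M_n;
\]
for instance, $p(\zeta)=\zeta^2+1$ has roots $\pm i$, so $a(p)=0$.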

Optimizing Matrix Stability

Given an affine subspace of square matrices, we consider the problem of minimizing the spectral abscissa (the largest real part of an eigenvalue). We give an example whose optimal solution has Jordan form consisting of a single Jordan block, and we show, using non-Lipschitz variational analysis, that this behaviour persists under arbitrarily small perturbations to …
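Written out, the problem class is the following; the affine parametrization below is the standard one and is added here only for illustration:
\[
\min_{x\in\Rl^m}\; \alpha\Bigl(A_0 + \sum_{j=1}^{m} x_j A_j\Bigr),
\qquad
\alpha(A) \;=\; \max\{\operatorname{Re} z : z \text{ is an eigenvalue of } A\},
\]
where $A_0,\dots,A_m$ are fixed square matrices.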

Approximating Subdifferentials by Random Sampling of Gradients

Many interesting real functions on Euclidean space are differentiable almost everywhere. All Lipschitz functions have this property, but so, for example, does the spectral abscissa of a matrix (a non-Lipschitz function). In practice, the gradient is often easy to compute. We investigate to what extent we can approximate the Clarke subdifferential of such a function …
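A minimal sketch of the sampling idea, added for illustration and assuming a gradient oracle grad_f that is valid almost everywhere: gradients are evaluated at random points in a small ball around x, and their convex hull serves as the approximation of the Clarke subdifferential.

    import numpy as np

    def sampled_gradients(grad_f, x, radius=1e-4, n_samples=100, rng=None):
        # grad_f: gradient oracle, assumed available almost everywhere
        # Returns gradients at random points in a ball of the given radius
        # around x; their convex hull gives the approximation.
        rng = np.random.default_rng() if rng is None else rng
        x = np.asarray(x, dtype=float)
        grads = []
        for _ in range(n_samples):
            direction = rng.standard_normal(x.shape)
            direction /= np.linalg.norm(direction)
            r = radius * rng.uniform() ** (1.0 / x.size)   # uniform point in the ball
            grads.append(grad_f(x + r * direction))
        return np.asarray(grads)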

Optimal Stability and Eigenvalue Multiplicity

We consider the problem of minimizing over an affine set of square matrices the maximum of the real parts of the eigenvalues. Such problems are prototypical in robust control and stability analysis. Under nondegeneracy conditions, we show that the multiplicities of the active eigenvalues at a critical matrix remain unchanged under small perturbations of the …

A Pattern Search Filter Method for Nonlinear Programming without Derivatives

This paper presents and analyzes a pattern search method for general constrained optimization based on filter methods for step acceptance. Roughly, a filter method accepts a step that either improves the objective function value or the value of some function that measures the constraint violation. The new algorithm does not compute or approximate any derivatives, …
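The acceptance rule just described can be sketched as follows; this is an added illustration of the generic filter idea, with the acceptance margins and the pattern-search machinery of the paper omitted.

    def acceptable(h_trial, f_trial, filter_pairs):
        # A trial point is acceptable when no stored (h, f) pair dominates it,
        # i.e. it improves either the constraint violation h or the objective f
        # relative to every filter entry.
        return all(h_trial < h or f_trial < f for (h, f) in filter_pairs)

    def update_filter(h_trial, f_trial, filter_pairs):
        # Keep only entries not dominated by the newly accepted point.
        kept = [(h, f) for (h, f) in filter_pairs if h < h_trial or f < f_trial]
        kept.append((h_trial, f_trial))
        return kept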