Convex optimization under combinatorial sparsity constraints

We present a heuristic approach for convex optimization problems with sparsity constraints. These can be cardinality constraints, but our approach also covers more complex constraints on the support of the solution. For the special case in which the support is required to belong to a matroid, we propose an exchange heuristic that adapts the support in … Read more
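
As a rough illustration (not the paper's algorithm), the sketch below applies a single-element exchange heuristic to the uniform-matroid special case, i.e. a plain cardinality constraint $|\mathrm{supp}(x)|\le k$, with a least-squares objective standing in for the generic convex objective; all names and tolerances are illustrative.

import numpy as np

def restricted_lstsq(A, b, support):
    # Solve the convex subproblem with the variables restricted to `support`.
    idx = sorted(support)
    x = np.zeros(A.shape[1])
    x[idx], *_ = np.linalg.lstsq(A[:, idx], b, rcond=None)
    return x, float(np.linalg.norm(A @ x - b) ** 2)

def exchange_heuristic(A, b, k, max_passes=100):
    # Uniform-matroid case: feasible supports are the index sets of size at most k.
    n = A.shape[1]
    support = set(range(k))                      # arbitrary feasible starting support
    x, best = restricted_lstsq(A, b, support)
    for _ in range(max_passes):
        improved = False
        for i in sorted(support):
            for j in range(n):
                if j in support:
                    continue
                cand = (support - {i}) | {j}     # single exchange keeps |support| = k
                x_c, val = restricted_lstsq(A, b, cand)
                if val < best - 1e-12:
                    support, x, best, improved = cand, x_c, val, True
                    break                        # first improvement: restart the scan
            if improved:
                break
        if not improved:                         # no improving exchange: local optimum
            break
    return x, sorted(support), best

rng = np.random.default_rng(0)
A, b = rng.standard_normal((30, 10)), rng.standard_normal(30)
x, S, val = exchange_heuristic(A, b, k=3)
print(S, round(val, 4))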

Entropic proximal operators for nonnegative trigonometric polynomials

Signal processing applications of semidefinite optimization are often rooted in sum-of-squares representations of nonnegative trigonometric polynomials. Interior-point solvers for semidefinite optimization can handle constraints of this form with a per-iteration complexity that is cubic in the degree of the trigonometric polynomial. The purpose of this paper is to discuss first-order methods with a lower complexity per … Read more
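
For background, and not specific to this paper: nonnegativity of a trigonometric polynomial can be expressed through the standard trace parameterization (conventions vary across references),
\[
p(\omega)=\sum_{k=-n}^{n} c_k e^{-jk\omega}\ \ge\ 0\ \ \text{for all }\omega
\quad\Longleftrightarrow\quad
\exists\, X\succeq 0:\ \ c_k=\operatorname{tr}(\Theta_k X),\ \ k=0,\dots,n,
\]
where $X$ is an $(n+1)\times(n+1)$ Hermitian matrix and $\Theta_k$ is the elementary Toeplitz matrix with ones on its $k$-th diagonal; the semidefinite variable $X$, whose size grows with the degree, is what makes the interior-point cost scale with the degree.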

Hadamard Directional Differentiability of the Optimal Value of a Linear Second-order Conic Programming Problem

In this paper, we consider perturbation properties of a linear second-order conic optimization problem and its Lagrange dual when all parameters of the problem are perturbed. We prove the upper semi-continuity of the solution mappings of the primal problem and the Lagrange dual problem. We demonstrate that the optimal value function can be expressed as … Read more
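
One standard way to write such a primal-dual pair (generic notation, not necessarily the paper's) is
\[
\min_{x}\ c^\top x\ \ \text{s.t.}\ Ax=b,\ x\in\mathcal{K},
\qquad\qquad
\max_{y,\,s}\ b^\top y\ \ \text{s.t.}\ A^\top y+s=c,\ s\in\mathcal{K},
\]
where $\mathcal{K}=\mathcal{Q}^{n_1}\times\cdots\times\mathcal{Q}^{n_r}$ is a product of second-order cones $\mathcal{Q}^{m}=\{(t,u)\in\mathbb{R}\times\mathbb{R}^{m-1}:\ \|u\|_2\le t\}$, and the perturbed parameters are the data $(A,b,c)$.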

On the Complexity of Testing Attainment of the Optimal Value in Nonlinear Optimization

We prove that unless P=NP, there exists no polynomial-time (or even pseudo-polynomial-time) algorithm that can test whether the optimal value of a nonlinear optimization problem whose objective and constraints are given by low-degree polynomials is attained. If the degrees of these polynomials are fixed, our results along with previously known “Frank-Wolfe type” theorems … Read more
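
A standard example (not taken from the paper) shows why attainment can fail even for an unconstrained problem with a degree-four polynomial objective:
\[
\inf_{(x,y)\in\mathbb{R}^2}\ (xy-1)^2+x^2\;=\;0,
\]
since $(x,y)=(1/t,\,t)$ yields the value $1/t^{2}\to 0$ as $t\to\infty$, yet the infimum is never attained: attaining $0$ would force $x=0$ and hence $(xy-1)^2=1$.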

Robust Principal Component Analysis using Facial Reduction

We study algorithms for robust principal component analysis (RPCA) of a partially observed data matrix. The aim is to recover the data matrix as the sum of a low-rank matrix and a sparse matrix so as to eliminate erratic noise (outliers). This problem is known to be NP-hard in general. A classical way to solve … Read more
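
For orientation, a common convex formulation of partially observed RPCA (not necessarily the one analyzed in the paper) is
\[
\min_{L,\,S}\ \|L\|_*+\lambda\|S\|_1\quad\text{s.t.}\quad \mathcal{P}_\Omega(L+S)=\mathcal{P}_\Omega(M),
\]
where $\|\cdot\|_*$ is the nuclear norm, $\|\cdot\|_1$ the entrywise $\ell_1$ norm, $M$ the data matrix, $\Omega$ the set of observed entries, and $\mathcal{P}_\Omega$ the mask that keeps only those entries.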

Computational performance of a projection and rescaling algorithm

This paper documents a computational implementation of a {\em projection and rescaling algorithm} for finding most interior solutions to the pair of feasibility problems: find $x\in L\cap\mathbb{R}^n_{+}$ and find $x\in L^\perp\cap\mathbb{R}^n_{+}$, where $L$ denotes a linear subspace in $\mathbb{R}^n$ and $L^\perp$ denotes its orthogonal complement. The projection and rescaling algorithm is a recently developed … Read more
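
A simple and standard observation explains why the two problems are complementary: if $x\in L\cap\mathbb{R}^n_{++}$ and $0\neq\hat{x}\in L^\perp\cap\mathbb{R}^n_{+}$, then
\[
0=\langle x,\hat{x}\rangle>0,
\]
a contradiction; hence if one problem has a strictly positive solution, the other admits only the zero vector.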

A Globally Asymptotically Stable Polynomial Vector Field with Rational Coefficients and no Local Polynomial Lyapunov Function

We give an explicit example of a two-dimensional polynomial vector field of degree seven that has rational coefficients, is globally asymptotically stable, but does not admit an analytic Lyapunov function even locally.
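
For context, one common form of the local Lyapunov conditions that the example rules out for analytic $V$ (and hence, in particular, for polynomial $V$) is: a continuously differentiable $V$ on a neighborhood of the origin with
\[
V(0)=0,\qquad V(x)>0\ \text{for }x\neq 0,\qquad \langle\nabla V(x),\,f(x)\rangle<0\ \text{for }x\neq 0,
\]
where $\dot{x}=f(x)$ denotes the vector field in question.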

Axial symmetry indices for convex cones: axiomatic formalism and applications

We address the issue of measuring the degree of axial symmetry of a convex cone. By following an axiomatic approach, we introduce and explore the concept of axial symmetry index. This concept is illustrated with the help of several interesting examples. By way of application, we establish a conic version of the Blekherman inequality concerning … Read more

Optimality conditions and global convergence for nonlinear semidefinite programming

Sequential optimality conditions have played a major role in unifying and extending global convergence results for several classes of algorithms for general nonlinear optimization. In this paper, we extend these concepts to nonlinear semidefinite programming and define two sequential optimality conditions for this setting. The first is a natural extension of the so-called Approximate-Karush-Kuhn-Tucker … Read more
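
For reference, in standard nonlinear programming, $\min f(x)$ subject to $g(x)\le 0$ and $h(x)=0$, the Approximate-KKT condition asks for sequences $x^k\to x^*$, $\lambda^k\in\mathbb{R}^m_{+}$, $\mu^k\in\mathbb{R}^p$ such that
\[
\nabla f(x^k)+\sum_{i=1}^{m}\lambda_i^k\nabla g_i(x^k)+\sum_{j=1}^{p}\mu_j^k\nabla h_j(x^k)\to 0,
\qquad
\min\{-g_i(x^k),\,\lambda_i^k\}\to 0\ \ \text{for all }i;
\]
roughly speaking, the semidefinite analogue replaces the inequality constraints by a positive semidefinite constraint with matrix multipliers (the precise definitions are the paper's).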

SOS-Convex Lyapunov Functions and Stability of Difference Inclusions

We introduce the concept of sos-convex Lyapunov functions for stability analysis of both linear and nonlinear difference inclusions (also known as discrete-time switched systems). These are polynomial Lyapunov functions that have an algebraic certificate of convexity and that can be efficiently found via semidefinite programming. We prove that sos-convex Lyapunov functions are universal (i.e., necessary … Read more
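
To fix ideas with standard definitions (this is background, not a summary of the paper's results): a polynomial $V$ is sos-convex if $(x,y)\mapsto y^\top\nabla^2 V(x)\,y$ is a sum of squares, and for a linear difference inclusion $x_{k+1}\in\{A_1x_k,\dots,A_mx_k\}$ a common Lyapunov certificate asks for a positive definite $V$ with
\[
V(A_i x)<V(x)\qquad\text{for all }x\neq 0,\ \ i=1,\dots,m,
\]
both requirements being relaxable to sum-of-squares constraints that a semidefinite program can search over.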