Improving the linear relaxation of maximum $k$-cut with semidefinite-based constraints

We consider the maximum $k$-cut problem, which asks for a partition of the vertex set of a graph into $k$ subsets such that the sum of the weights of the edges joining vertices in different subsets is maximized. The associated semidefinite programming (SDP) relaxation is known to provide strong bounds, but it has a high computational cost. We … Read more
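For orientation, a standard SDP relaxation of maximum $k$-cut (in the spirit of Frieze and Jerrum; the paper's exact formulation may differ) uses a matrix variable $X$ and edge weights $w_{ij}$:
\[
\max\ \frac{k-1}{k}\sum_{\{i,j\}\in E} w_{ij}\bigl(1-X_{ij}\bigr)
\quad\text{s.t.}\quad X_{ii}=1\ (i\in V),\quad X_{ij}\ge -\tfrac{1}{k-1}\ (i\ne j),\quad X\succeq 0.
\]
Dropping the conic condition $X\succeq 0$ gives one possible linear relaxation, though the LP formulation studied in the paper may be stated in different variables.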

Using Nemirovski’s Mirror-Prox method as Basic Procedure in Chubanov’s method for solving homogeneous feasibility problems

We introduce a new variant of Chubanov’s method for solving linear homogeneous systems with positive variables. In the Basic Procedure we use a recently introduced cut in combination with Nemirovski’s Mirror-Prox method. We show that the cut requires at most $O(n^3)$ time, just as Chubanov’s cut. In an earlier paper it was shown that the new … Read more
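Schematically (and without reproducing the paper's notation), the homogeneous feasibility problem addressed by Chubanov-type methods is
\[
\text{find } x\in\mathbb{R}^n \ \text{such that}\ Ax=0,\ x>0,
\]
which, by Stiemke's lemma, is infeasible exactly when the alternative system $A^{\top}y\ge 0$, $A^{\top}y\ne 0$ has a solution.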

BBCPOP: A Sparse Doubly Nonnegative Relaxation of Polynomial Optimization Problems with Binary, Box and Complementarity Constraints

The software package BBCPOP is a MATLAB implementation of a hierarchy of sparse doubly nonnegative (DNN) relaxations of a class of polynomial optimization (minimization) problems (POPs) with binary, box and complementarity (BBC) constraints. Given a POP in the class and a relaxation order, BBCPOP constructs a simple conic optimization problem (COP), which serves as a … Read more
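As a rough illustration of the problem class (the index sets $I_{\mathrm{bin}}$, $I_{\mathrm{box}}$ and $C$ below are generic notation, not the package's interface), a POP with BBC constraints has the form
\[
\min_{x\in\mathbb{R}^n}\ f(x)\quad\text{s.t.}\quad x_i\in\{0,1\}\ (i\in I_{\mathrm{bin}}),\quad 0\le x_j\le 1\ (j\in I_{\mathrm{box}}),\quad x_i x_j=0\ \bigl((i,j)\in C\bigr),
\]
where $f$ is a polynomial; BBCPOP relaxes such problems to sparse doubly nonnegative conic optimization problems.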

User Manual for BBCPOP: A Sparse Doubly Nonnegative Relaxation of Polynomial Optimization Problems with Binary, Box and Complementarity Constraints

BBCPOP proposed in [4] is a MATLAB implementation of a hierarchy of sparse doubly nonnegative (DNN) relaxations of a class of polynomial optimization (minimization) problems (POPs) with binary, box and complementarity constraints. Given a POP in the class and a relaxation order (or a hierarchy level), BBCPOP constructs a simple conic optimization problem (COP), which … Read more

Convex optimization under combinatorial sparsity constraints

We present a heuristic approach for convex optimization problems containing sparsity constraints. The latter can be cardinality constraints, but our approach also covers more complex constraints on the support of the solution. For the special case that the support is required to belong to a matroid, we propose an exchange heuristic adapting the support in … Read more
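In generic form (the notation below is illustrative, not the paper's), such problems read
\[
\min_{x\in\mathbb{R}^n}\ f(x)\quad\text{s.t.}\quad \operatorname{supp}(x)\in\mathcal{S},
\]
where $f$ is convex and $\mathcal{S}$ is a prescribed family of admissible supports, e.g. $\mathcal{S}=\{S\subseteq\{1,\dots,n\}:|S|\le k\}$ for a cardinality constraint, or the family of independent sets of a matroid.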

Entropic proximal operators for nonnegative trigonometric polynomials

Signal processing applications of semidefinite optimization are often rooted in sum-of-squares representations of nonnegative trigonometric polynomials. Interior-point solvers for semidefinite optimization can handle constraints of this form with a per-iteration complexity that is cubic in the degree of the trigonometric polynomial. The purpose of this paper is to discuss first-order methods with a lower complexity per … Read more
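The sum-of-squares representation in question is the classical Gram-matrix characterization: a trigonometric polynomial $p(\theta)=\sum_{k=-d}^{d} c_k e^{\mathrm{i}k\theta}$ with $c_{-k}=\overline{c_k}$ satisfies
\[
p(\theta)\ge 0\ \ \forall\theta
\quad\Longleftrightarrow\quad
\exists\, X\succeq 0:\ \ c_k=\sum_{j=1}^{d+1-k} X_{j+k,\,j},\quad k=0,\dots,d,
\]
where $X$ is a Hermitian $(d+1)\times(d+1)$ matrix; this semidefinite matrix variable of order $d+1$ is the source of the cubic per-iteration cost mentioned above.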

Hadamard Directional Differentiability of the Optimal Value of a Linear Second-order Conic Programming Problem

In this paper, we consider perturbation properties of a linear second-order conic optimization problem and its Lagrange dual in which all parameters in the problem are perturbed. We prove the upper semi-continuity of solution mappings for the primal problem and the Lagrange dual problem. We demonstrate that the optimal value function can be expressed as … Read more
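For reference, the primal-dual pair being perturbed can be written in the standard form (with data $(A,b,c)$ as the perturbed parameters; the paper may use a slightly different parametrization):
\[
\min_x\ c^{\top}x\ \ \text{s.t.}\ \ Ax=b,\ x\in\mathcal{K},
\qquad\qquad
\max_{y,s}\ b^{\top}y\ \ \text{s.t.}\ \ A^{\top}y+s=c,\ s\in\mathcal{K},
\]
where $\mathcal{K}$ is a Cartesian product of second-order cones $\{(x_0,\bar{x}): x_0\ge\|\bar{x}\|_2\}$.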

On the Complexity of Testing Attainment of the Optimal Value in Nonlinear Optimization

We prove that unless P=NP, there exists no polynomial time (or even pseudo-polynomial time) algorithm that can test whether the optimal value of a nonlinear optimization problem, where the objective and constraints are given by low-degree polynomials, is attained. If the degrees of these polynomials are fixed, our results along with previously known “Frank-Wolfe type” theorems … Read more
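A textbook illustration (not taken from the paper) of why attainment is a nontrivial property even for low-degree polynomials:
\[
\inf_{x,y\in\mathbb{R}}\ x^2+(xy-1)^2 = 0,
\]
yet no point achieves this infimum, since the objective vanishes only if $x=0$ and $xy=1$ hold simultaneously, while along $x=\varepsilon$, $y=1/\varepsilon$ the value $\varepsilon^2$ tends to $0$.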

Robust Principal Component Analysis using Facial Reduction

We study algorithms for robust principal component analysis (RPCA) for a partially observed data matrix. The aim is to recover the data matrix as a sum of a low-rank matrix and a sparse matrix so as to eliminate erratic noise (outliers). This problem is known to be NP-hard in general. A classical way to solve … Read more
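A widely used convex formulation of this decomposition (the paper's facial-reduction-based approach may deviate from it) is
\[
\min_{L,S}\ \|L\|_{*}+\lambda\|S\|_{1}
\quad\text{s.t.}\quad P_{\Omega}(L+S)=P_{\Omega}(M),
\]
where $M$ is the partially observed data matrix, $\Omega$ the set of observed entries, $\|\cdot\|_{*}$ the nuclear norm and $\|\cdot\|_{1}$ the entrywise $\ell_1$ norm.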

Computational performance of a projection and rescaling algorithm

This paper documents a computational implementation of a {\em projection and rescaling algorithm} for finding most interior solutions to the pair of feasibility problems: find $x\in L\cap\mathbb{R}^n_{+}$ and find $x\in L^\perp\cap\mathbb{R}^n_{+}$, where $L$ denotes a linear subspace in $\mathbb{R}^n$ and $L^\perp$ denotes its orthogonal complement. The projection and rescaling algorithm is a recently developed … Read more
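Concretely, when $L$ is given as the null space of a matrix $A\in\mathbb{R}^{m\times n}$, the two feasibility problems read (with $x\ne 0$, since the zero vector is trivially feasible)
\[
\text{find } x\ \text{with}\ Ax=0,\ x\in\mathbb{R}^n_{+}\setminus\{0\},
\qquad\text{and}\qquad
\text{find } x=A^{\top}y\ \text{with}\ x\in\mathbb{R}^n_{+}\setminus\{0\}.
\]
Note that $L\cap\mathbb{R}^n_{++}$ and $L^{\perp}\cap\mathbb{R}^n_{++}$ cannot both be nonempty, since any $x\in L$ and $z\in L^{\perp}$ satisfy $x^{\top}z=0$.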