Exploiting Sparsity in SDP Relaxation for Sensor Network Localization

A sensor network localization problem can be formulated as a quadratic optimization problem (QOP). For quadratic optimization problems, semidefinite programming (SDP) relaxation by Lasserre with relaxation order 1 for general polynomial optimization problems (POPs) is known to be equivalent to the sparse SDP relaxation by Waki et al. with relaxation order 1, except the … Read more
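For context (a standard formulation, not quoted from the truncated abstract), the localization problem with unknown sensor positions x_1,…,x_n, known anchor positions a_k, and given distances d can be written as a quadratically constrained feasibility problem:

```latex
\text{find } x_1,\dots,x_n \in \mathbb{R}^{\ell}
\quad\text{such that}\quad
\|x_i - x_j\|^2 = d_{ij}^2 \;\; ((i,j)\in\mathcal{N}_x),
\qquad
\|a_k - x_i\|^2 = d_{ki}^2 \;\; ((k,i)\in\mathcal{N}_a).
```

The quadratic equality constraints are what the (sparse) SDP relaxations discussed above are applied to.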

A Numerical Algorithm for Block-Diagonal Decomposition of Matrix *-Algebras, Part I: Proposed Approach and Application to Semidefinite Programming

Motivated by recent interest in group-symmetry in semidefinite programming, we propose a numerical method for finding a finest simultaneous block-diagonalization of a finite number of matrices, or equivalently the irreducible decomposition of the generated matrix *-algebra. The method is composed of numerical linear-algebraic computations such as eigenvalue computation, and automatically makes full use of the … Read more
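As a rough illustration of the eigenvalue-based idea (a minimal sketch restricted to the special case of pairwise-commuting symmetric matrices, not the general *-algebra algorithm of the paper): the eigenvectors of a generic random linear combination of the matrices simultaneously diagonalize, and hence block-diagonalize, the whole family.

```python
import numpy as np

def simultaneous_diagonalization(mats, tol=1e-8):
    """Simplified sketch: simultaneously diagonalize a family of
    pairwise-commuting symmetric matrices by eigendecomposing a
    generic random linear combination of them."""
    rng = np.random.default_rng(0)
    coeffs = rng.standard_normal(len(mats))
    generic = sum(c * A for c, A in zip(coeffs, mats))  # generic element of the span
    _, P = np.linalg.eigh(generic)                      # orthogonal eigenvectors
    transformed = [P.T @ A @ P for A in mats]
    for B in transformed:
        off = B - np.diag(np.diag(B))
        assert np.max(np.abs(off)) < tol, "matrices do not commute"
    return P, transformed

# Tiny example: two commuting symmetric matrices.
A1 = np.array([[2.0, 1.0], [1.0, 2.0]])
A2 = np.array([[0.0, 1.0], [1.0, 0.0]])
P, diag_mats = simultaneous_diagonalization([A1, A2])
print(np.round(diag_mats[0], 6))
print(np.round(diag_mats[1], 6))
```

The paper's method handles the general case, where the output is a common block-diagonal (not diagonal) structure.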

Correlative sparsity in primal-dual interior-point methods for LP, SDP and SOCP

Exploiting sparsity has been a key issue in solving large-scale optimization problems. The most time-consuming part of primal-dual interior-point methods for linear programs, second-order cone programs, and semidefinite programs is solving the Schur complement equation at each iteration, usually by the Cholesky factorization. The computational efficiency is greatly affected by the sparsity of the coefficient … Read more
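To recall the linear algebra involved (a generic sketch with made-up data, not code from the paper): each interior-point iteration reduces to a symmetric positive definite system B Δy = r, and the sparsity of the data determines how sparse B and its Cholesky factor are.

```python
import numpy as np
from scipy.linalg import cho_factor, cho_solve

# Hypothetical Schur complement matrix B (symmetric positive definite)
# and right-hand side r; in an interior-point method these change at
# every iteration while the sparsity pattern of B stays fixed.
rng = np.random.default_rng(1)
M = rng.standard_normal((5, 5))
B = M @ M.T + 5 * np.eye(5)   # make it safely positive definite
r = rng.standard_normal(5)

c, low = cho_factor(B)        # Cholesky factorization (the costly step)
dy = cho_solve((c, low), r)   # cheap triangular solves reuse the factor
print(np.allclose(B @ dy, r))
```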

Efficient Evaluation of Polynomials and Their Partial Derivatives in Homotopy Continuation Methods

The aim of this paper is to study how to efficiently evaluate a system of multivariate polynomials and their partial derivatives in homotopy continuation methods. Our major tool is an extension of the Horner scheme, which is popular for evaluating a univariate polynomial, to a multivariate polynomial. But the extension is not unique, and there … Read more
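For illustration, here is one possible recursive extension of the Horner scheme to a multivariate polynomial (the nested-coefficient representation is our own choice, not the paper's; the non-uniqueness of such extensions is precisely the point raised in the abstract):

```python
def horner_multivariate(coeffs, point):
    """One possible multivariate Horner scheme: view the polynomial as a
    univariate polynomial in the first variable whose coefficients are
    polynomials in the remaining variables (nested coefficient lists).
    With a single variable this reduces to the classical Horner scheme."""
    if not point:                       # no variables left: coeffs is a number
        return coeffs
    x, rest = point[0], point[1:]
    result = 0.0
    for c in reversed(coeffs):
        result = result * x + horner_multivariate(c, rest)
    return result

# p(x, y) = 1 + 2*y + x*(3 + 4*y) + 5*x**2, written as a polynomial in x
# whose coefficients are coefficient lists of polynomials in y.
nested = [[1.0, 2.0], [3.0, 4.0], [5.0]]
print(horner_multivariate(nested, (2.0, 3.0)))   # p(2, 3) = 57.0
```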

Recognizing Underlying Sparsity in Optimization

Exploiting sparsity is essential to improve the efficiency of solving large optimization problems. We present a method for recognizing the underlying sparsity structure of a nonlinear partially separable problem, and show how the sparsity of the Hessian matrices of the problem’s functions can be improved by performing a nonsingular linear transformation in the space corresponding … Read more
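A minimal worked example of such a transformation (our own illustration, not taken from the abstract): f(x) = (x_1 + x_2)^2 has a completely dense Hessian, but after a nonsingular change of variables y = Tx it depends on y_1 only and the Hessian becomes sparse:

```latex
\nabla^2 f(x) = \begin{pmatrix} 2 & 2\\ 2 & 2 \end{pmatrix},
\qquad
y = Tx,\ \ T = \begin{pmatrix} 1 & 1\\ 0 & 1 \end{pmatrix}
\ \Longrightarrow\
\tilde f(y) = f(T^{-1}y) = y_1^{\,2},
\quad
\nabla^2 \tilde f(y) = \begin{pmatrix} 2 & 0\\ 0 & 0 \end{pmatrix}.
```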

Dynamic Enumeration of All Mixed Cells

The polyhedral homotopy method, which has been known as a powerful numerical method for computing all isolated zeros of a polynomial system, requires all mixed cells of the support of the system to construct a family of homotopy functions. Finding the mixed cells is formulated in terms of a linear inequality system with an additional … Read more
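The linear-inequality formulation of a single mixed-cell test can be sketched as follows (a simplified, hypothetical example for two polynomials in two variables; the function names and data are ours, and a generic lifting is assumed so that non-strict feasibility suffices):

```python
import itertools
import numpy as np
from scipy.optimize import linprog

def is_mixed_cell(supports, liftings, edges):
    """Feasibility test for one candidate mixed cell: for each support A_i
    with lifting w_i, the chosen edge {a_i, b_i} must attain the minimum of
    <., alpha> + w_i(.) over A_i for one common vector alpha."""
    n = len(supports[0][0])              # number of variables
    A_eq, b_eq, A_ub, b_ub = [], [], [], []
    for A_i, w_i, (a, b) in zip(supports, liftings, edges):
        a_pt, b_pt = np.array(A_i[a]), np.array(A_i[b])
        # equality: <a, alpha> + w(a) = <b, alpha> + w(b)
        A_eq.append(a_pt - b_pt)
        b_eq.append(w_i[b] - w_i[a])
        for k, c in enumerate(A_i):
            if k in (a, b):
                continue
            # inequality: <a, alpha> + w(a) <= <c, alpha> + w(c)
            A_ub.append(a_pt - np.array(c))
            b_ub.append(w_i[k] - w_i[a])
    res = linprog(np.zeros(n), A_ub=A_ub, b_ub=b_ub,
                  A_eq=A_eq, b_eq=b_eq,
                  bounds=[(None, None)] * n, method="highs")
    return res.status == 0               # 0 means a feasible alpha exists

# Supports of two polynomials in two variables and random liftings.
supports = [[(0, 0), (1, 0), (0, 1)], [(0, 0), (2, 0), (0, 2)]]
rng = np.random.default_rng(3)
liftings = [rng.random(len(A)).tolist() for A in supports]

# Enumerate all candidate edge pairs and keep the mixed cells.
cells = [
    (e1, e2)
    for e1 in itertools.combinations(range(3), 2)
    for e2 in itertools.combinations(range(3), 2)
    if is_mixed_cell(supports, liftings, [e1, e2])
]
print(cells)
```

The paper's contribution concerns how to enumerate such cells efficiently and dynamically rather than testing every candidate as done in this brute-force sketch.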

A Note on Sparse SOS and SDP Relaxations for Polynomial Optimization Problems over Symmetric Cones

This short note extends the sparse SOS (sum of squares) and SDP (semidefinite programming) relaxation proposed by Waki, Kim, Kojima and Muramatsu for normal POPs (polynomial optimization problems) to POPs over symmetric cones, and establishes its theoretical convergence based on the recent convergence result by Lasserre on the sparse SOS and SDP relaxation for normal … Read more

Parallel Primal-Dual Interior-Point Methods for SemiDefinite Programs

The Semidefinite Program (SDP) is a fundamental problem in mathematical programming. It covers a wide range of applications, such as combinatorial optimization, control theory, polynomial optimization, and quantum chemistry. Solving extremely large-scale SDPs that could not be solved before is significant work that opens up a new vista of future applications of SDPs. Our … Read more

SparsePOP: a Sparse Semidefinite Programming Relaxation of Polynomial Optimization Problems

SparsePOP is a MATLAB implementation of a sparse semidefinite programming (SDP) relaxation method proposed for polynomial optimization problems (POPs) in the recent paper by Waki et al. The sparse SDP relaxation is based on a hierarchy of LMI relaxations of increasing dimensions by Lasserre, and exploits a sparsity structure of polynomials in POPs. The efficiency … Read more
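As a rough illustration of why exploiting sparsity pays off (the numbers are ours, not from the abstract): the dense Lasserre relaxation of order r in n variables uses a moment matrix of size \binom{n+r}{r}, whereas the sparse relaxation uses one block of size \binom{|C_k|+r}{r} per maximal clique C_k of the correlative sparsity pattern. For example,

```latex
n = 10,\ r = 2:\qquad
\binom{12}{2} = 66 \ \text{(dense moment matrix)}
\quad\text{vs.}\quad
\binom{4}{2} = 6 \ \text{per clique of size } |C_k| = 2 .
```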

Sums of Squares and Semidefinite Programming Relaxations for Polynomial Optimization Problems with Structured Sparsity

Unconstrained and inequality constrained sparse polynomial optimization problems (POPs) are considered. A correlative sparsity pattern graph is defined to find a certain sparse structure in the objective and constraint polynomials of a POP. Based on this graph, sets of supports for sums of squares (SOS) polynomials that lead to efficient SOS and semidefinite programming (SDP) … Read more
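A small sketch of what a correlative sparsity pattern graph looks like (our own illustrative example, not code from the paper): nodes are variables, two variables are joined whenever they appear together in the objective or in some constraint polynomial, and the variable sets for the smaller SOS/moment matrices come from the maximal cliques (of a chordal extension, omitted here since the example graph is already chordal).

```python
import itertools
import networkx as nx

# Variable index sets of each polynomial in a hypothetical sparse POP:
# f(x) = f1(x1,x2) + f2(x2,x3) + f3(x3,x4), constraint g(x1,x2) >= 0.
polynomial_supports = [{0, 1}, {1, 2}, {2, 3}, {0, 1}]

G = nx.Graph()
G.add_nodes_from(range(4))
for vars_used in polynomial_supports:
    # connect every pair of variables appearing in the same polynomial
    G.add_edges_from(itertools.combinations(sorted(vars_used), 2))

# Maximal cliques of the csp graph: [[0, 1], [1, 2], [2, 3]]
print(sorted(map(sorted, nx.find_cliques(G))))
```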