Sparse PSD approximation of the PSD cone

While semidefinite programming (SDP) problems are polynomially solvable in theory, it is often difficult to solve large SDP instances in practice. One technique to address this issue is to relax the global positive-semidefiniteness (PSD) constraint and only enforce PSD-ness on smaller k × k principal submatrices — we call this the sparse SDP relaxation. Surprisingly, it has been observed empirically that in some cases this approach produces bounds that are close to the optimal objective function value of the original SDP. In this paper, we formally compare the strength of the sparse SDP relaxation vis-à-vis the original SDP from a theoretical perspective. To simplify the question, we arrive at a data-independent version of it, in which we compare the sizes of the SDP cone and the k-PSD closure, the cone of matrices whose k × k principal submatrices are all PSD. In particular, we investigate how far a matrix of unit Frobenius norm in the k-PSD closure can be from the SDP cone. We provide two incomparable upper bounds on this farthest distance as a function of k and n, together with matching lower bounds showing that the upper bounds are tight within a constant in different regimes of k and n. In addition to linear algebra techniques, we make extensive use of probabilistic methods to arrive at these bounds. One of the lower bounds is obtained by observing a connection between matrices in the k-PSD closure and matrices satisfying the restricted isometry property (RIP).
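To make the objects in the abstract concrete, below is a minimal sketch (assuming NumPy; the function names such as in_k_psd_closure are my own illustration, not from the paper) of a membership test for the k-PSD closure and of the Frobenius distance from a symmetric matrix to the SDP cone, using the standard fact that projecting onto the PSD cone clips negative eigenvalues at zero. The small 3 × 3 example has all 2 × 2 principal submatrices PSD while the matrix itself is not, showing that the k-PSD closure strictly contains the SDP cone.

```python
import itertools
import numpy as np

def in_k_psd_closure(A, k, tol=1e-9):
    """True iff every k x k principal submatrix of symmetric A is PSD."""
    n = A.shape[0]
    for idx in itertools.combinations(range(n), k):
        sub = A[np.ix_(idx, idx)]
        if np.linalg.eigvalsh(sub).min() < -tol:
            return False
    return True

def frobenius_dist_to_psd_cone(A):
    """Frobenius distance from symmetric A to the PSD cone.

    The nearest PSD matrix is obtained by clipping negative eigenvalues
    at zero, so the distance is the norm of the negative part of the
    spectrum."""
    eigs = np.linalg.eigvalsh(A)
    return float(np.linalg.norm(np.minimum(eigs, 0.0)))

# A matrix whose 2 x 2 principal submatrices are all PSD, yet the matrix
# itself is not PSD (its eigenvalues are 2, 2, -1).
A = np.array([[ 1.0, 1.0, -1.0],
              [ 1.0, 1.0,  1.0],
              [-1.0, 1.0,  1.0]])
A /= np.linalg.norm(A)                  # normalize to unit Frobenius norm

print(in_k_psd_closure(A, k=2))         # True
print(frobenius_dist_to_psd_cone(A))    # 1/3 > 0: outside the SDP cone
```

The abstract's central quantity is the largest value this distance can take over all unit-Frobenius-norm matrices in the k-PSD closure, as a function of k and n.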
