A Parametric Approach for Solving Convex Quadratic Optimization with Indicators Over Trees

This paper investigates convex quadratic optimization problems involving $n$ indicator variables, each associated with a continuous variable, focusing on the case where the matrix $Q$ defining the quadratic term is positive definite and its sparsity pattern corresponds to the adjacency matrix of a tree graph. We introduce a graph-based dynamic programming algorithm that solves this …
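
For context, the problem class described here is usually stated as a mixed-integer quadratic program in which each indicator forces its continuous variable to zero when switched off; a generic form (the exact objective and constraints are given in the full paper) is

$$
\min_{x \in \mathbb{R}^n,\; z \in \{0,1\}^n} \; a^\top x + \tfrac{1}{2}\, x^\top Q x + c^\top z \quad \text{s.t.} \quad x_i (1 - z_i) = 0, \quad i = 1, \dots, n,
$$

where $Q \succ 0$ and the off-diagonal sparsity pattern of $Q$ forms a tree.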

Polyhedral Analysis of Quadratic Optimization Problems with Stieltjes Matrices and Indicators

In this paper, we consider convex quadratic optimization problems with indicators on the continuous variables. In particular, we assume that the Hessian of the quadratic term is a Stieltjes matrix, which arises naturally in sparse graphical inference problems, among others. We describe an explicit convex formulation for the problem by studying the Stieltjes polyhedron arising …
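
As a reminder (standard definition, not specific to this paper), a Stieltjes matrix is a real symmetric positive definite matrix with nonpositive off-diagonal entries:

$$
Q = Q^\top \succ 0, \qquad Q_{ij} \le 0 \quad \text{for all } i \ne j.
$$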

Sparse Polynomial Optimization with Unbounded Sets

This paper considers sparse polynomial optimization with unbounded sets. When the problem possesses correlative sparsity, we propose a sparse homogenized Moment-SOS hierarchy with perturbations to solve it. The new hierarchy introduces one extra auxiliary variable for each variable clique according to the correlative sparsity pattern. Under the running intersection property, we prove that this hierarchy …
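
For readers unfamiliar with correlative sparsity (standard terminology, not quoted from the paper): roughly speaking, the variables are covered by cliques $I_1, \dots, I_p$ so that the objective splits clique-by-clique and each constraint involves only one clique,

$$
f(x) = \sum_{k=1}^{p} f_k\!\left(x_{I_k}\right), \qquad I_k \cap \left(I_1 \cup \cdots \cup I_{k-1}\right) \subseteq I_j \ \text{ for some } j < k,
$$

the second condition being the running intersection property referred to above.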

On Tractable Convex Relaxations of Standard Quadratic Optimization Problems under Sparsity Constraints

Standard quadratic optimization problems (StQPs) provide a versatile modelling tool in various applications. In this paper, we consider StQPs with a hard sparsity constraint, referred to as sparse StQPs. We focus on various tractable convex relaxations of sparse StQPs arising from a mixed-binary quadratic formulation, namely, the linear optimization relaxation given by the reformulation-linearization technique, …
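
In the usual notation (a generic statement; the paper gives the precise formulation), a sparse StQP restricts the standard quadratic program over the simplex by a cardinality bound:

$$
\min_{x \in \mathbb{R}^n} \; x^\top Q x \quad \text{s.t.} \quad e^\top x = 1, \quad x \ge 0, \quad \|x\|_0 \le \rho,
$$

where $e$ is the all-ones vector and $\rho$ is the prescribed sparsity level.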

Compressed Sensing: A Discrete Optimization Approach

We study the Compressed Sensing (CS) problem, which is the problem of finding the sparsest vector that satisfies a set of linear measurements up to some numerical tolerance. CS is a central problem in Statistics, Operations Research and Machine Learning which arises in applications such as signal processing, data compression and image reconstruction. We …
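
Stated as a discrete optimization problem (the standard formulation; the paper's exact tolerance model may differ), CS reads

$$
\min_{x \in \mathbb{R}^n} \; \|x\|_0 \quad \text{s.t.} \quad \|Ax - b\|_2 \le \epsilon,
$$

where $A$ and $b$ collect the linear measurements and $\epsilon$ is the numerical tolerance.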

First-Order Methods for Nonsmooth Nonconvex Functional Constrained Optimization with or without Slater Points

Constrained optimization problems where both the objective and constraints may be nonsmooth and nonconvex arise across many learning and data science settings. In this paper, we show a simple first-order method finds a feasible, $\epsilon$-stationary point at a convergence rate of $O(\epsilon^{-4})$ without relying on compactness or a Constraint Qualification (CQ). When CQ holds, this convergence is measured by …
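
The problem class in question is functional constrained optimization; in generic form (not quoted from the paper),

$$
\min_{x \in \mathbb{R}^d} \; f(x) \quad \text{s.t.} \quad g_i(x) \le 0, \quad i = 1, \dots, m,
$$

where $f$ and the $g_i$ may be both nonsmooth and nonconvex, and $\epsilon$-stationarity is measured in a suitable approximate-KKT sense.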

An SDP Relaxation for the Sparse Integer Least Square Problem

In this paper, we study the polynomial approximability or solvability of the sparse integer least square problem (SILS), an NP-hard variant of the least square problem in which only sparse $\{0, \pm 1\}$-vectors are considered. We propose an $\ell_1$-based SDP relaxation for SILS, and introduce a randomized algorithm for SILS based on the SDP relaxation. In …
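
A generic way to write SILS (the precise sparsity model, e.g. the bound $k$ below, is specified in the paper) is

$$
\min_{x} \; \|Ax - b\|_2^2 \quad \text{s.t.} \quad x \in \{0, \pm 1\}^n, \quad \|x\|_0 \le k.
$$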

Sparse multi-term disjunctive cuts for the epigraph of a function of binary variables

We propose a new method for separating valid inequalities for the epigraph of a function of binary variables. The proposed inequalities are disjunctive cuts defined by disjunctive terms obtained by enumerating a subset $I$ of the binary variables. We show that by restricting the support of the cut to the same set of variables $I$, …
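
The object being cut is the epigraph of a function of binary variables; in generic notation (not verbatim from the paper),

$$
\mathrm{epi}(f) = \left\{ (x, t) \in \{0,1\}^n \times \mathbb{R} \;:\; t \ge f(x) \right\},
$$

and enumerating the assignments of the variables indexed by $I \subseteq \{1, \dots, n\}$ yields a disjunction with $2^{|I|}$ terms from which the cuts are derived.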

A Graph-based Decomposition Method for Convex Quadratic Optimization with Indicators

In this paper, we consider convex quadratic optimization problems with indicator variables when the matrix $Q$ defining the quadratic term in the objective is sparse. We use a graphical representation of the support of $Q$, and show that if this graph is a path, then we can solve the associated problem in polynomial time. This …
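
Here the graphical representation of the support of $Q$ refers, in standard terminology, to the graph with one vertex per variable and an edge between every pair of variables coupled by $Q$:

$$
G = (V, E), \qquad V = \{1, \dots, n\}, \qquad E = \{\, \{i, j\} : Q_{ij} \ne 0,\ i \ne j \,\}.
$$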

Sparse Plus Low Rank Matrix Decomposition: A Discrete Optimization Approach

We study the Sparse Plus Low-Rank decomposition problem (SLR), which is the problem of decomposing a corrupted data matrix into a sparse matrix of perturbations plus a low-rank matrix containing the ground truth. SLR is a fundamental problem in Operations Research and Machine Learning which arises in various applications, including data compression, latent semantic indexing, …
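
In a generic discrete-optimization form (the exact norms and budgets are given in the paper), SLR seeks

$$
\min_{S, L} \; \|D - S - L\|_F^2 \quad \text{s.t.} \quad \|S\|_0 \le \sigma, \quad \operatorname{rank}(L) \le k,
$$

where $D$ is the observed (corrupted) data matrix, $S$ the sparse matrix of perturbations, and $L$ the low-rank matrix containing the ground truth.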