Motivated by modern regression applications, in this paper we study the convexification of quadratic optimization problems with indicator variables and combinatorial constraints on the indicators. Unlike most previous work on the convexification of sparse regression problems, we simultaneously consider the nonlinear objective, the indicator variables, and the combinatorial constraints. We prove that, for a separable quadratic objective function, the perspective reformulation is ideal independent of the constraints of the problem. In contrast, while rank-one relaxations cannot be strengthened by exploiting information from a k-sparsity constraint with k greater than one, they can be improved for other constraints arising in inference problems with hierarchical structure or multicollinearity.
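To fix ideas, here is a minimal sketch of the perspective reformulation for a separable quadratic term, standard in this literature; the notation (c, d, Z) is chosen for illustration and is not taken from the report. For continuous variables x with indicators z enforcing x_i = 0 whenever z_i = 0, the mixed-integer problem

\[
\min_{x,\,z,\,t}\; c^{\top}x + \sum_{i=1}^{n} d_i t_i
\quad \text{s.t.} \quad
x_i^2 \le t_i,\;\; x_i(1 - z_i) = 0,\;\; z \in \{0,1\}^n \cap Z,
\]

with d_i \ge 0 and Z \subseteq \{0,1\}^n encoding the combinatorial constraints on the indicators, is strengthened by replacing each epigraph constraint x_i^2 \le t_i with its perspective counterpart

\[
x_i^2 \le t_i z_i, \qquad i = 1, \dots, n,
\]

the closure of z_i (x_i / z_i)^2 \le t_i. Each strengthened constraint is a rotated second-order cone constraint, so the continuous relaxation remains conic-representable.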
Citation
Research report AG 20.01, ISE, University of Southern California, January 2020