Sparse Regression at Scale: Branch-and-Bound rooted in First-Order Optimization

We consider the least squares regression problem, penalized with a combination of the L0 and L2 norms (a.k.a. L0-L2 regularization). Recent work presents strong evidence that the resulting L0-based estimators can outperform popular sparse learning methods in many important high-dimensional settings. However, exact computation of L0-based estimators remains a major challenge. Indeed, state-of-the-art mixed …
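For concreteness, the estimator in question minimizes a penalized least squares objective of the form (up to the paper's exact parameterization)
\[ \min_{\beta \in \mathbb{R}^p} \; \tfrac{1}{2}\|y - X\beta\|_2^2 + \lambda_0 \|\beta\|_0 + \lambda_2 \|\beta\|_2^2, \]
where $\|\beta\|_0$ counts the nonzero entries of $\beta$, and $\lambda_0, \lambda_2 \ge 0$ trade off sparsity against shrinkage.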

A new interior-point approach for large two-stage stochastic problems

Two-stage stochastic models give rise to very large optimization problems. Several approaches have been devised for solving them efficiently, including interior-point methods (IPMs). However, in IPMs the linking columns associated with first-stage decisions cause excessive fill-in in the solution of the normal equations. This downside is usually alleviated if variable splitting is applied to first-stage …
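As a sketch of the structure involved (the standard deterministic-equivalent form over scenarios $s = 1,\dots,S$; notation may differ from the paper's),
\[ \min_{x,\, y_1,\dots,y_S} \; c^\top x + \sum_{s=1}^{S} p_s\, q_s^\top y_s \quad \text{s.t.} \quad Ax = b, \quad T_s x + W_s y_s = h_s, \quad x \ge 0, \; y_s \ge 0, \]
the first-stage variables $x$ appear in every scenario block; these linking columns are what produce the dense contributions to the normal equations mentioned above.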

Shape-Constrained Regression using Sum of Squares Polynomials

We consider the problem of fitting a polynomial function to a set of data points, each consisting of a feature vector and a response variable. In contrast to standard polynomial regression, we require that the polynomial regressor satisfy shape constraints, such as monotonicity, Lipschitz continuity, or convexity. We show how to use semidefinite programming …
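To illustrate one standard route to semidefinite representability (a generic sum-of-squares certificate, not necessarily the paper's exact formulation), monotonicity of a polynomial regressor $p$ in the variable $x_j$ over a set $B = \{x : g_1(x)\ge 0,\dots,g_m(x)\ge 0\}$ can be enforced by requiring
\[ \frac{\partial p}{\partial x_j}(x) = \sigma_0(x) + \sum_{i=1}^{m} \sigma_i(x)\, g_i(x), \qquad \sigma_0,\dots,\sigma_m \text{ sums of squares,} \]
which translates into semidefinite constraints on the coefficients of $p$ and the multipliers $\sigma_i$.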

On the symmetry of induced norm cones

Several authors have studied the problem of making an asymmetric cone symmetric through a change of inner product, and one set of positive results pertains to the class of elliptic cones. We demonstrate that the class of elliptic cones is equal to the class of induced-norm cones that arise through Jordan-isomorphism with the second-order cone, …
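For reference, with the standard definitions (the paper's precise conventions may differ), the cone induced by a norm $\|\cdot\|_f$ on $\mathbb{R}^n$ is
\[ K_f = \{(x,t) \in \mathbb{R}^n \times \mathbb{R} : \|x\|_f \le t\}, \]
with the second-order cone recovered for the Euclidean norm, and elliptic cones commonly taken to be the case $\|x\|_f = \|Ax\|_2$ for an invertible matrix $A$.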

Bregman primal–dual first-order method and application to sparse semidefinite programming

We present a new variant of the Chambolle–Pock primal–dual method with Bregman distances, analyze its convergence, and apply it to the centering problem in sparse semidefinite programming. The novelty in the method is a line search procedure for selecting suitable step sizes. The line search obviates the need for estimating the norm of the constraint …
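To fix notation (the generic Chambolle–Pock iteration with Bregman proximal terms; the paper's line search governs the step sizes $\tau_k, \sigma_k$), for the saddle-point problem $\min_x \max_y \; \langle Ax, y\rangle + f(x) - g^*(y)$ the updates take a form like
\[ x^{k+1} = \arg\min_x \; f(x) + \langle A^\top y^k, x\rangle + \tfrac{1}{\tau_k} D_x(x, x^k), \qquad y^{k+1} = \arg\min_y \; g^*(y) - \langle A(2x^{k+1} - x^k), y\rangle + \tfrac{1}{\sigma_k} D_y(y, y^k), \]
where $D_x$ and $D_y$ are Bregman distances; choosing the step sizes by line search removes the need to know $\|A\|$.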

A note on the Lasserre hierarchy for different formulations of the maximum independent set problem

In this note, we consider several polynomial optimization formulations of the maximum independent set problem and the use of the Lasserre hierarchy with these different formulations. We demonstrate, using computational experiments, that the choice of formulation may have a significant impact on the resulting bounds. We also provide theoretical justifications for the observed behavior. …
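As an example of two such formulations (standard encodings from the literature; the note may consider others), for a graph $G=(V,E)$ with adjacency matrix $A_G$ the stability number satisfies
\[ \alpha(G) = \max\Big\{ \textstyle\sum_{i\in V} x_i \;:\; x_i x_j = 0 \;\; \forall \{i,j\}\in E, \;\; x_i^2 = x_i \;\; \forall i\in V \Big\} \quad\text{and}\quad \frac{1}{\alpha(G)} = \min\Big\{ x^\top (A_G + I)\, x \;:\; \textstyle\sum_{i} x_i = 1, \; x \ge 0 \Big\}, \]
the latter being the Motzkin–Straus formulation; applying the Lasserre hierarchy to different encodings can give different bounds at the same level.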

Projection and rescaling algorithm for finding most interior solutions to polyhedral conic systems

We propose a simple projection and rescaling algorithm that finds {\em most interior} solutions to the pair of feasibility problems \[ \text{find } x\in L\cap \mathbb{R}^n_{+} \quad \text{and} \quad \text{find } \hat x\in L^\perp\cap \mathbb{R}^n_{+}, \] where $L$ is a linear subspace of $\mathbb{R}^n$ and $L^\perp$ is its orthogonal complement. The algorithm complements a basic procedure that …

A Combinatorial Cut-and-Lift Procedure with an Application to 0-1 Chance Constraints

Cut generation and lifting are key components for the performance of state-of-the-art mathematical programming solvers. This work proposes a new general cut-and-lift procedure that exploits the combinatorial structure of 0-1 problems via a binary decision diagram (BDD) encoding of their constraints. We present a general framework that can be applied to a large range of …

On optimality conditions for nonlinear conic programming

Sequential optimality conditions have played a major role in proving stronger global convergence results for numerical algorithms in nonlinear programming. Several extensions have been described in conic contexts, where many open questions have arisen. In this paper, we present new sequential optimality conditions in the context of a general nonlinear conic framework, which explains and …
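To fix ideas, the prototypical sequential condition in standard nonlinear programming, which the conic extensions generalize, is the approximate-KKT (AKKT) condition: for $\min f(x)$ subject to $g_i(x)\le 0$, a point $x^*$ satisfies AKKT if there exist sequences $x^k \to x^*$ and $\lambda^k \ge 0$ with
\[ \nabla f(x^k) + \sum_i \lambda_i^k \nabla g_i(x^k) \to 0 \qquad\text{and}\qquad \min\{-g_i(x^k),\, \lambda_i^k\} \to 0 \;\; \text{for all } i. \]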

Near-optimal analysis of univariate moment bounds for polynomial optimization

We consider a recent hierarchy of upper approximations proposed by Lasserre (arXiv:1907.097784, 2019) for the minimization of a polynomial f over a compact set K ⊆ ℝⁿ. This hierarchy relies on the push-forward measure of the Lebesgue measure on K by the polynomial f and involves univariate sums of squares of polynomials of growing degree 2r. …
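Concretely, in the standard form of Lasserre's measure-based upper bounds specialized to this setting (modulo normalization conventions), the level-$r$ bound is
\[ \mathrm{ub}_r = \min_{q} \Big\{ \int_K f(x)\, q(f(x))\, dx \;:\; \int_K q(f(x))\, dx = 1, \;\; q \text{ a univariate sum of squares of degree at most } 2r \Big\} \;\ge\; \min_{x\in K} f(x), \]
which is an optimization over the push-forward of the Lebesgue measure on K by f, hence a univariate moment problem.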