Range of the displacement operator of PDHG with applications to quadratic and conic programming

Primal-dual hybrid gradient (PDHG) is a first-order method for saddle-point problems and convex programming introduced by Chambolle and Pock. Recently, Applegate et al. analyzed the behavior of PDHG when applied to an infeasible or unbounded instance of linear programming, and in particular, showed that PDHG is able to diagnose these conditions. Their analysis hinges on … Read more
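
For orientation (a standard textbook form, not quoted from the paper), PDHG applied to the saddle-point formulation of a standard-form LP, $\min_{x \ge 0} \max_{y} \; c^\top x + y^\top (b - Ax)$, iterates

$$
x^{k+1} = \Pi_{x \ge 0}\!\left( x^k - \tau \, (c - A^\top y^k) \right), \qquad
y^{k+1} = y^k + \sigma \left( b - A (2 x^{k+1} - x^k) \right),
$$

and the displacement operator of the title is the map $z \mapsto T(z) - z$, where $T$ denotes one such iteration and $z = (x, y)$; in the infeasible or unbounded case the displacements do not vanish, and their limiting behavior is what underlies the diagnosis described above.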

Polyhedral Properties of RLT Relaxations of Nonconvex Quadratic Programs and Their Implications on Exact Relaxations

We study linear programming relaxations of nonconvex quadratic programs given by the reformulation-linearization technique (RLT), referred to as RLT relaxations. We investigate the relations between the polyhedral properties of the feasible regions of a quadratic program and its RLT relaxation. We establish various connections between recession directions, boundedness, and vertices of the two feasible regions. … Read more
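
For orientation, the level-one RLT relaxation of a QP over a polyhedron $\{x : Ax \le b\}$ (written here in generic form) introduces a matrix variable $X$ standing in for $xx^\top$ and linearizes the pairwise products of constraints: from $(b_i - a_i^\top x)(b_j - a_j^\top x) \ge 0$ one obtains

$$
b_i b_j - b_j \, a_i^\top x - b_i \, a_j^\top x + a_i^\top X a_j \ge 0 \qquad \text{for all } i \le j,
$$

alongside the original constraints $Ax \le b$, while the quadratic objective $x^\top Q x + c^\top x$ becomes the linear function $\langle Q, X \rangle + c^\top x$; the relaxation is therefore a linear program whose feasible region can be compared with that of the original QP.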

Enhancements of Discretization Approaches for Non-Convex Mixed-Integer Quadratically Constrained Quadratic Programming: Part II

This is Part II of a study on mixed-integer programming (MIP) relaxation techniques for the solution of non-convex mixed-integer quadratically constrained quadratic programs (MIQCQPs). We focus on MIP relaxation methods for non-convex continuous variable products and extend the well-known normalized multiparametric disaggregation technique (NMDT), an MIP relaxation, applying a sophisticated discretization to both … Read more
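
As background, one common form of NMDT (recalled here from the discretization literature, not quoted from this paper) writes one factor of a bilinear product $w = xy$ on a normalized scale and discretizes it in base 10 to depth $p$:

$$
y = y^L + (y^U - y^L)\,\lambda, \qquad
\lambda = \sum_{l=1}^{p} \sum_{k=0}^{9} 10^{-l} k \, z_{k,l} + \Delta\lambda, \qquad
\sum_{k=0}^{9} z_{k,l} = 1, \quad z_{k,l} \in \{0,1\}, \quad 0 \le \Delta\lambda \le 10^{-p},
$$

after which the products $x \, z_{k,l}$ are linearized exactly (the $z_{k,l}$ are binary) and $x \, \Delta\lambda$ is bounded by McCormick-type inequalities, giving an MIP relaxation of $w = xy$.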

Enhancements of Discretization Approaches for Non-Convex Mixed-Integer Quadratically Constrained Quadratic Programming: Part I

We study mixed-integer programming (MIP) relaxation techniques for the solution of non-convex mixed-integer quadratically constrained quadratic programs (MIQCQPs). We present MIP relaxation methods for non-convex continuous variable products. In Part I, we consider MIP relaxations based on separable reformulation. The main focus is the introduction of the enhanced separable MIP relaxation for non-convex quadratic products … Read more
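
For context, the separable reformulation on which such relaxations are built (stated generically here) replaces each bilinear product by a difference of univariate squares,

$$
xy = \tfrac{1}{4}\left( (x + y)^2 - (x - y)^2 \right),
$$

so that only one-dimensional quadratic terms remain; each of these can then be under- and over-estimated by piecewise-linear MIP constructions, which is the starting point for the enhanced relaxations mentioned in the abstract.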

On the convergence of iterative schemes for solving a piecewise linear system of equations

This paper is devoted to studying the global and finite convergence of the semi-smooth Newton method for solving a piecewise linear system that arises in cone-constrained quadratic programming problems and absolute value equations. We first provide a negative answer via a counterexample to a conjecture on the global and finite convergence of the Newton iteration … Read more
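
To make the setting concrete, here is a minimal sketch (not code from the paper) of the generalized, semi-smooth Newton iteration for the absolute value equation $Ax - |x| = b$, one of the piecewise linear systems the abstract mentions; each step solves a linear system with the generalized Jacobian $A - \mathrm{diag}(\mathrm{sign}(x^k))$.

    import numpy as np

    def generalized_newton_ave(A, b, x0=None, tol=1e-10, max_iter=100):
        # Semi-smooth Newton for the absolute value equation A x - |x| = b.
        # Each step solves (A - diag(sign(x_k))) x_{k+1} = b.
        x = np.zeros(b.size) if x0 is None else np.array(x0, dtype=float)
        for _ in range(max_iter):
            J = A - np.diag(np.sign(x))          # generalized Jacobian at x
            x = np.linalg.solve(J, b)            # Newton step (a piecewise linear system)
            if np.linalg.norm(A @ x - np.abs(x) - b) <= tol:
                break
        return x

    # Small test instance; the dominant diagonal keeps the singular values of A
    # well above 1, the standard regime in which the AVE is uniquely solvable.
    rng = np.random.default_rng(0)
    A = 6.0 * np.eye(5) + 0.3 * rng.standard_normal((5, 5))
    b = rng.standard_normal(5)
    x = generalized_newton_ave(A, b)
    print(np.linalg.norm(A @ x - np.abs(x) - b))  # residual, expected near machine precision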

Log-domain interior-point methods for convex quadratic programming

Applying an interior-point method to the central-path conditions is a widely used approach for solving quadratic programs. Reformulating these conditions in the log-domain is a natural variation on this approach that, to our knowledge, has not been studied previously. In this paper, we analyze log-domain interior-point methods and prove their polynomial-time convergence. We also prove that they … Read more
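
To give the flavor of the reformulation (a generic sketch of the central-path conditions with a log-domain change of variables, not the paper's exact development), consider a convex QP $\min_x \tfrac12 x^\top Q x + c^\top x$ subject to $Ax = b$, $x \ge 0$. The central path at parameter $\mu > 0$ requires

$$
Ax = b, \qquad Qx + c - A^\top y - s = 0, \qquad x_i s_i = \mu, \qquad x, s > 0,
$$

and the log-domain idea is to parametrize $x_i = \sqrt{\mu}\, e^{v_i}$ and $s_i = \sqrt{\mu}\, e^{-v_i}$, so that the complementarity conditions $x_i s_i = \mu$ hold identically and only the remaining residuals have to be driven to zero in the variable $v$.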

Generating Cutting Inequalities Successively for Quadratic Optimization Problems in Binary Variables

We propose a method that successively generates cutting inequalities for binary quadratic optimization problems. Multiple cutting inequalities are successively generated for the convex hull of the set of optimal solutions $\subset \{0, 1\}^n$, while the standard cutting inequalities are used for the convex hull of the feasible region. An arbitrary linear inequality with integer coefficients … Read more
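
In generic terms (not the paper's specific construction), each round solves the current relaxation, obtains a point $\hat{x}$, and then looks for an integer-coefficient inequality that is valid for the target convex hull but violated at $\hat{x}$:

$$
\text{find } (a, a_0) \in \mathbb{Z}^n \times \mathbb{Z} \quad \text{with} \quad a^\top x \le a_0 \ \ \forall x \in \mathrm{conv}(S), \qquad a^\top \hat{x} > a_0,
$$

where $S$ is either the set of optimal solutions or the feasible region, as described above; the cut is appended to the relaxation and the process repeats.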

A new stopping criterion for Krylov solvers applied in Interior Point Methods

This paper presents a surprising result with potentially far-reaching consequences for any optimization technique that relies on Krylov subspace methods to solve the underlying systems of linear equations. The advantages of the new technique are illustrated in the context of Interior Point Methods (IPMs). When an iterative method is … Read more
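
For context, the conventional setup (recalled here generically, not the paper's new criterion) is that each IPM iteration hands a linear system $M z = r$ to a Krylov solver such as CG or MINRES and stops the inner iterations once a relative residual tolerance is met, e.g.

$$
\| r - M z_k \| \le \varepsilon \, \| r \|,
$$

with $\varepsilon$ chosen a priori or tightened as the IPM converges; the paper argues for a different way of deciding when the inner Krylov iterations may be stopped.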

Supermodularity and valid inequalities for quadratic optimization with indicators

We study the minimization of a rank-one quadratic with indicators and show that the underlying set function obtained by projecting out the continuous variables is supermodular. Although supermodular minimization is, in general, difficult, the specific set function for the rank-one quadratic can be minimized in linear time. We show that the convex hull of the … Read more
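
For reference, supermodularity of a set function $f : 2^{[n]} \to \mathbb{R}$ means

$$
f(S \cup T) + f(S \cap T) \ge f(S) + f(T) \qquad \text{for all } S, T \subseteq [n],
$$

and the set function in question here is, generically, the value obtained by fixing the support $S$ of the continuous variables and minimizing the rank-one quadratic over it, $f(S) = \min \{\, q(x) : x_i = 0 \ \text{for } i \notin S \,\}$; the abstract's point is that this particular $f$ can be minimized in linear time even though supermodular minimization is hard in general.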

A family of optimal weighted conjugate-gradient-type methods for strictly convex quadratic minimization

We introduce a family of weighted conjugate-gradient-type methods, for strictly convex quadratic functions, whose parameters are determined by a minimization model based on a convex combination of the objective function and its gradient norm. This family includes the classical linear conjugate gradient method and the recently published delayed weighted gradient method as the extreme cases … Read more
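
As a point of reference for one of the two extreme cases mentioned, here is a minimal textbook implementation of the classical linear conjugate gradient method for $\min_x \tfrac12 x^\top A x - b^\top x$ with $A$ symmetric positive definite (not the weighted family introduced in the paper):

    import numpy as np

    def linear_cg(A, b, x0=None, tol=1e-10, max_iter=None):
        # Classical linear conjugate gradient for f(x) = 0.5 x^T A x - b^T x,
        # A symmetric positive definite; in exact arithmetic it terminates in
        # at most n iterations.
        n = b.size
        x = np.zeros(n) if x0 is None else np.array(x0, dtype=float)
        r = b - A @ x                    # residual = -grad f(x)
        p = r.copy()                     # first search direction
        rs = r @ r
        for _ in range(max_iter or n):
            Ap = A @ p
            alpha = rs / (p @ Ap)        # exact line search along p
            x += alpha * p
            r -= alpha * Ap
            rs_new = r @ r
            if np.sqrt(rs_new) <= tol:
                break
            p = r + (rs_new / rs) * p    # conjugate direction update
            rs = rs_new
        return x

    rng = np.random.default_rng(1)
    M = rng.standard_normal((6, 6))
    A = M @ M.T + 6.0 * np.eye(6)        # symmetric positive definite test matrix
    b = rng.standard_normal(6)
    print(np.linalg.norm(A @ linear_cg(A, b) - b))  # expected near machine precision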