Hybrid LP/SDP Bounding Procedure

The principal idea of this paper is to exploit Semidefinite Programming (SDP) relaxation within the framework provided by Mixed Integer Nonlinear Programming (MINLP) solvers when tackling Binary Quadratic Problems (BQP). SDP relaxation is well-known to provide strong bounds for BQP in practice. However, the method is not typically implemented in many state-of-the-art MINLP solvers based … Read more
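
For orientation, the standard SDP relaxation of a binary quadratic problem $\min\{x^\top Q x + c^\top x : x \in \{0,1\}^n\}$ (a textbook sketch, not necessarily the exact relaxation used in the paper) lifts the products $x_i x_j$ into a matrix variable $X$:
\[
\min_{x,\, X} \ \langle Q, X \rangle + c^\top x
\quad \text{s.t.} \quad
\operatorname{diag}(X) = x, \qquad
\begin{pmatrix} 1 & x^\top \\ x & X \end{pmatrix} \succeq 0 .
\]
The binary condition $x_i^2 = x_i$ is captured by $\operatorname{diag}(X) = x$, and the semidefinite constraint replaces the nonconvex requirement $X = x x^\top$.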

Simplified Copositive and Lagrangian Relaxations for Linearly Constrained Quadratic Optimization Problems in Continuous and Binary Variables

For a quadratic optimization problem (QOP) with linear equality constraints in continuous nonnegative variables and binary variables, we propose three relaxations in simplified forms with a parameter $\lambda$: Lagrangian, completely positive, and copositive relaxations. These relaxations are obtained by reducing the QOP to an equivalent QOP with a single quadratic equality constraint in nonnegative variables, … Read more
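
For reference, a generic member of this problem class can be written schematically (our notation, with $I$ the index set of binary variables) as
\[
\min_{x \ge 0} \ x^\top Q x + 2 c^\top x
\quad \text{s.t.} \quad
A x = b, \qquad x_i (1 - x_i) = 0 \ \ (i \in I),
\]
where the binary requirements appear as quadratic equalities; the reduction mentioned above collapses the constraints into a single quadratic equality in nonnegative variables, which underlies the $\lambda$-parameterized relaxations.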

Lowest-rank Solutions of Continuous and Discrete Lyapunov Equations over Symmetric Cone

The low-rank solutions of continuous and discrete Lyapunov equations are of great importance in control system analysis and design, but they are generally difficult to obtain. Fortunately, Mesbahi and Papavassilopoulos [On the rank minimization problems over a positive semidefinite linear matrix inequality, IEEE Trans. Auto. Control, Vol. 42, No. 2 (1997), 239-243] showed that with the semidefinite … Read more
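
For reference, the continuous- and discrete-time Lyapunov equations in their standard forms are
\[
A^\top X + X A + Q = 0
\qquad \text{and} \qquad
A^\top X A - X + Q = 0 ,
\]
and the cited result of Mesbahi and Papavassilopoulos concerns minimizing $\operatorname{rank}(X)$ subject to a positive semidefinite linear matrix inequality.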

The Spectral Bundle Method with Second-Order Information

The spectral bundle method was introduced by Helmberg and Rendl to solve a class of eigenvalue optimization problems that is equivalent to the class of semidefinite programs with the constant trace property. We investigate the feasibility and effectiveness of including full or partial second-order information in the spectral bundle method, building on work of Overton … Read more
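
In the standard Helmberg-Rendl setting (sketched here in generic notation), an SDP $\max\{\langle C, X\rangle : \mathcal{A}(X) = b,\ X \succeq 0\}$ whose feasible set satisfies $\operatorname{tr}(X) = a$ is equivalent to the nonsmooth eigenvalue optimization problem
\[
\min_{y} \ a\, \lambda_{\max}\!\bigl(C - \mathcal{A}^{*}(y)\bigr) + b^\top y ,
\]
to which bundle-type methods, and in particular the spectral bundle method, can be applied.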

Improving an interior-point approach for large block-angular problems by hybrid preconditioners

The computational time required by interior-point methods is often dominated by the solution of linear systems of equations. An efficient specialized interior-point algorithm for primal block-angular problems has been used to solve these systems by combining Cholesky factorizations for the block constraints and a conjugate gradient based on a power series preconditioner for the linking … Read more
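
As a schematic illustration (generic notation, not the paper's), a primal block-angular constraint matrix has the structure
\[
A =
\begin{pmatrix}
A_1 &        &     &   \\
    & \ddots &     &   \\
    &        & A_k &   \\
L_1 & \cdots & L_k & I
\end{pmatrix},
\]
where the diagonal blocks $A_i$ give rise to the systems handled by Cholesky factorizations and the bottom block row collects the linking constraints, whose contribution to the normal equations is addressed by the preconditioned conjugate gradient.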

Compressed Sensing Off the Grid

We consider the problem of estimating the frequency components of a mixture of $s$ complex sinusoids from a random subset of $n$ regularly spaced samples. Unlike previous work in compressed sensing, the frequencies are not assumed to lie on a grid, but can assume any values in the normalized frequency domain $[0, 1]$. We propose … Read more
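
In this setting (generic notation), the available data are samples of the form
\[
x_j \;=\; \sum_{k=1}^{s} c_k\, e^{i 2\pi f_k j},
\qquad j \in \Omega \subset \{0, 1, \dots, n-1\},
\]
where the amplitudes $c_k$ and the off-grid frequencies $f_k \in [0,1]$ are unknown and $\Omega$ is the random subset of the $n$ regularly spaced sample locations.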

A Quadratically Constrained Quadratic Optimization Model for Completely Positive Cone Programming

We propose a class of quadratic optimization problems whose exact optimal objective values can be computed by their completely positive cone programming relaxations. The objective function can be any quadratic form. The constraints of each problem are described in terms of quadratic forms with no linear terms, and all constraints are homogeneous equalities, except one … Read more
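
A schematic member of this class (our notation; the single exceptional constraint is shown here purely for illustration as a normalization) is
\[
\min_{x} \ x^\top Q_0 x
\quad \text{s.t.} \quad
x^\top Q_k x = 0 \ \ (k = 1, \dots, m), \qquad x^\top H x = 1 ,
\]
in which every constraint except the last is a homogeneous quadratic equality, matching the description above.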

Primal-dual subgradient method for Huge-Scale Linear Conic Problems

In this paper we develop a {\em primal-dual} subgradient method for solving huge-scale Linear Conic Optimization Problems. Our main assumption is that the primal cone is formed as a direct product of many small-dimensional convex cones, and that the matrix $A$ of the corresponding linear operator is {\em uniformly sparse}. In this case, our method can … Read more
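
The problem class in question has the generic conic form (our notation)
\[
\min_{x} \ c^\top x
\quad \text{s.t.} \quad
A x = b, \qquad x \in K = K_1 \times K_2 \times \cdots \times K_N ,
\]
where each $K_i$ is a convex cone of small dimension; uniform sparsity of $A$ means, roughly, that the number of nonzero entries in each row and column remains small, a property the method is designed to exploit.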

Hankel Matrix Rank Minimization with Applications to System Identification and Realization

We introduce a flexible optimization framework for nuclear norm minimization of matrices with linear structure, including Hankel, Toeplitz and moment structures, and catalog applications from diverse fields under this framework. We discuss various first-order methods for solving the resulting optimization problem, including alternating direction methods of multipliers, proximal point algorithms and gradient projection methods. We … Read more
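
A schematic instance of this framework (our notation) is regularized structured nuclear norm minimization,
\[
\min_{y} \ f\bigl(\mathcal{A}(y) - b\bigr) + \mu \,\bigl\| \mathcal{H}(y) \bigr\|_{*} ,
\]
where $\mathcal{H}$ maps the decision variable $y$ linearly to a structured (e.g., Hankel, Toeplitz, or moment) matrix, $\| \cdot \|_{*}$ denotes the nuclear norm, and $f$ measures data misfit; the first-order methods listed above operate on this composite objective.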

Convergence Analysis of an Inexact Feasible Interior Point Method for Convex Quadratic Programming

In this paper we will discuss two variants of an inexact feasible interior point algorithm for convex quadratic programming. We will consider two different neighbourhoods: a (small) one induced by the use of the Euclidean norm which yields a short-step algorithm and a symmetric one induced by the use of the infinity norm which yields … Read more
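
For concreteness, the two neighbourhoods can be written in their standard forms (assumed here; the paper's precise definitions may differ in scaling) as
\[
\mathcal{N}_2(\theta) = \bigl\{ (x, y, s) \in \mathcal{F}^{0} : \| X S e - \mu e \|_2 \le \theta \mu \bigr\},
\qquad
\mathcal{N}_{S}(\gamma) = \bigl\{ (x, y, s) \in \mathcal{F}^{0} : \gamma \mu \le x_i s_i \le \mu / \gamma \ \text{for all } i \bigr\},
\]
where $\mu = x^\top s / n$, $X = \operatorname{Diag}(x)$, $S = \operatorname{Diag}(s)$, and $\mathcal{F}^{0}$ denotes the strictly feasible set; the Euclidean-norm neighbourhood underlies the short-step variant, and the symmetric, infinity-norm-induced neighbourhood underlies the second.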