A New Relaxation Framework for Quadratic Assignment Problems based on Matrix Splitting

Quadratic assignment problems (QAPs) are among the hardest discrete optimization problems. Recent studies show that even obtaining a strong lower bound for a QAP is a computational challenge. In this paper, we first discuss how to construct new simple convex relaxations of QAPs based on various matrix splitting schemes. Then we introduce the so-called symmetric mappings … Read more
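As a hedged sketch of the setting (the notation below is assumed here, not taken from the truncated abstract), a QAP in trace form and a generic splitting of one coefficient matrix can be written as
\[
\min_{X \in \Pi_n} \operatorname{tr}\!\left(A X B X^{T}\right), \qquad B = B_{1} + B_{2},
\]
where $\Pi_n$ denotes the set of $n \times n$ permutation matrices; splitting one coefficient matrix into structured parts (for instance a positive semidefinite part plus a remainder) and bounding the resulting terms separately is the kind of device such relaxations rely on.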

Estimating Bounds for Quadratic Assignment Problems Associated with Hamming and Manhattan Distance Matrices based on Semidefinite Programming

Quadratic assignment problems (QAPs) with a Hamming distance matrix of a hypercube or a Manhattan distance matrix of a rectangular grid arise frequently in communications and facility location and are known to be among the hardest discrete optimization problems. In this paper we consider the issue of how to obtain lower bounds for those two classes … Read more
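For reference (standard definitions, assumed here rather than quoted from the truncated abstract): on the vertices $u, v \in \{0,1\}^{d}$ of a hypercube the Hamming distance is
\[
d_H(u,v) = \sum_{i=1}^{d} |u_i - v_i|,
\]
and on a rectangular grid the Manhattan distance between points $(i,j)$ and $(i',j')$ is $|i-i'| + |j-j'|$; the corresponding distance matrices play the role of one coefficient matrix of the QAP.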

Graph Modeling for Quadratic Assignment Problem Associated with the Hypercube

In this paper we consider the quadratic assignment problem arising from channel coding in communications, where one coefficient matrix is the adjacency matrix of a hypercube in a finite dimensional space. By using the geometric structure of the hypercube, we first show that there exist at least $n$ different optimal solutions to the underlying QAPs. … Read more
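One way to see why multiple optima must exist (a hedged sketch; the paper's own argument via the hypercube's geometry may differ): if $A$ is the adjacency matrix of the hypercube and $P$ is the permutation matrix of any graph automorphism, then $P A P^{T} = A$, so for every feasible $X$,
\[
\operatorname{tr}\!\left(A (PX) B (PX)^{T}\right) = \operatorname{tr}\!\left((P^{T} A P)\, X B X^{T}\right) = \operatorname{tr}\!\left(A X B X^{T}\right),
\]
and composing an optimal assignment with distinct automorphisms therefore yields further optimal solutions.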

Approximating K-means-type clustering via semidefinite programming

One of the fundamental clustering problems is to assign $n$ points into $k$ clusters based on the minimum sum-of-squares criterion (MSSC), which is known to be NP-hard. In this paper, by using matrix arguments, we first model MSSC as a so-called 0-1 semidefinite program (SDP). We show that our 0-1 SDP model provides a unified framework for … Read more
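As a hedged sketch of what such a 0-1 SDP model can look like (the symbols $W$, $Z$, $e$ below are assumptions, not quoted from the truncated abstract): with a data matrix $W \in \mathbb{R}^{n \times d}$ whose rows are the points, MSSC can be written as
\[
\min_{Z} \ \operatorname{tr}\!\left(W W^{T} (I - Z)\right) \quad \text{s.t.} \quad Z e = e, \ \operatorname{tr}(Z) = k, \ Z \ge 0, \ Z = Z^{T}, \ Z^{2} = Z,
\]
where $Z$ encodes the cluster memberships; relaxing the projection constraint $Z^{2} = Z$ (which forces 0-1 eigenvalues) to $Z \succeq 0$ yields a tractable SDP relaxation.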

On Mehrotra-Type Predictor-Corrector Algorithms

In this paper we discuss the polynomiality of Mehrotra-type predictor-corrector algorithms. We consider a variant of the original prototype of the algorithm that has been widely used in several IPM-based optimization packages, for which no complexity result is known to date. By an example we show that in this variant the usual Mehrotra-type adaptive … Read more
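For context, the adaptive updating rule usually associated with Mehrotra's predictor-corrector scheme (a standard description, assumed here rather than taken from the truncated abstract) chooses the centering parameter from the outcome of an affine-scaling predictor step:
\[
\mu = \frac{x^{T} s}{n}, \qquad \mu_{\mathrm{aff}} = \frac{(x + \alpha_{\mathrm{aff}} \Delta x^{\mathrm{aff}})^{T}(s + \alpha_{\mathrm{aff}} \Delta s^{\mathrm{aff}})}{n}, \qquad \sigma = \left(\frac{\mu_{\mathrm{aff}}}{\mu}\right)^{3},
\]
after which the corrector step targets the value $\sigma \mu$.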

On Implementing Self-Regular Proximity Based Feasible IPMs

Self-regular proximity based interior-point methods present a unified novel approach for solving linear optimization and conic optimization problems. So far it has not been known whether the new self-regular IPMs can lead to advances in computational practice similar to those shown in the theoretical analysis. In this paper, we present our experiences in developing the software package … Read more
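For orientation (our notation, assumed rather than quoted): the self-regular proximities in question are built from kernel functions of the form
\[
\Upsilon_{p,q}(t) = \frac{t^{p+1}-1}{p(p+1)} + \frac{t^{1-q}-1}{q(q-1)} + \frac{p-q}{pq}(t-1), \qquad p \ge 1, \ q > 1,
\]
with the proximity measure obtained by summing $\Upsilon_{p,q}$ over the components of the scaled variable $v_i = \sqrt{x_i s_i / \mu}$.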

A predictor-corrector algorithm for linear optimization based on a specific self-regular proximity function

It is well known that the so-called first-order predictor-corrector methods working in a large neighborhood of the central path are among the most efficient interior-point methods (IPMs) for linear optimization (LO) problems. However, the best known iteration complexity of this type of method is $O\left(n \log\frac{(x^0)^T s^0}{\varepsilon}\right)$. It is of interest to investigate whether the complexity … Read more
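For reference, the large (negative-infinity) neighborhood of the central path typically meant in this context (a standard definition, assumed here) is
\[
\mathcal{N}_{\infty}^{-}(\gamma) = \left\{ (x, y, s) \ \text{strictly feasible} : \ x_i s_i \ge \gamma \mu \ \ \forall i \right\}, \qquad \mu = \frac{x^{T} s}{n}, \ \gamma \in (0,1).
\]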

A Dynamic Large-Update Primal-Dual Interior-Point Method for Linear Optimization

Primal-dual interior-point methods (IPMs) have shown their power in solving large classes of optimization problems. However, at present there is still a gap between the practical behavior of these algorithms and their theoretical worst-case complexity results with respect to the strategy used for updating the duality gap parameter in the algorithm. The so-called small-update IPMs enjoy … Read more
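As background (classical results, not taken from the truncated abstract): small-update methods, which shrink the barrier parameter by a factor $1-\theta$ with $\theta = \Theta(1/\sqrt{n})$, carry the best known worst-case bound of $O\!\left(\sqrt{n}\,\log\frac{n}{\varepsilon}\right)$ iterations, while the large-update methods that perform better in practice were classically only known to need $O\!\left(n \log\frac{n}{\varepsilon}\right)$ iterations.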

A New and Efficient Large-Update Interior-Point Method for Linear Optimization

Recently, the authors presented a new large-update primal-dual method for Linear Optimization, whose $O\!\left(n^{\frac{2}{3}}\log\frac{n}{\varepsilon}\right)$ iteration bound substantially improved the classical bound for such methods, which is $O\!\left(n\log\frac{n}{\varepsilon}\right)$. In this paper we present an improved analysis of the new method. The analysis uses some new mathematical tools developed earlier when we considered a whole family of … Read more