Quadratic Optimization with Switching Variables: The Convex Hull for n=2
We consider quadratic optimization in variables (x,y), 0 … Read more
While semidefinite programming (SDP) problems are polynomially solvable in theory, it is often difficult to solve large SDP instances in practice. One technique to address this issue is to relax the global positive-semidefiniteness (PSD) constraint and only enforce PSD-ness on smaller $k \times k$ principal submatrices — we call this the sparse SDP relaxation. Surprisingly, … Read more
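The principal-submatrix condition above can be illustrated with a short sketch (the helper `sparse_psd_check` and the example matrix are illustrative, not code from the paper): requiring every $k \times k$ principal submatrix to be PSD is a necessary but, in general, not sufficient condition for the full matrix to be PSD.

```python
import numpy as np
from itertools import combinations

def sparse_psd_check(A, k, tol=1e-9):
    """Check the sparse SDP condition: every k x k principal
    submatrix of the symmetric matrix A is positive semidefinite.
    (Illustrative sketch, not the paper's implementation.)"""
    n = A.shape[0]
    for idx in combinations(range(n), k):
        sub = A[np.ix_(idx, idx)]          # principal submatrix
        if np.linalg.eigvalsh(sub).min() < -tol:
            return False
    return True

# A symmetric matrix whose 2x2 principal submatrices are all PSD,
# yet the full 3x3 matrix is not PSD (take x = (1, -1, 1)):
A = np.array([[ 1.0, 1.0, -1.0],
              [ 1.0, 1.0,  1.0],
              [-1.0, 1.0,  1.0]])
print(sparse_psd_check(A, 2))            # True: all 2x2 blocks PSD
print(np.linalg.eigvalsh(A).min() >= 0)  # False: A itself is not PSD
```

This gap between the sparse relaxation and full PSD-ness is exactly what makes the quality of the relaxation a nontrivial question.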
Quadratically constrained quadratic programs (QCQPs) are a fundamental class of optimization problems well-known to be NP-hard in general. In this paper we study sufficient conditions for a convex hull result that immediately implies that the standard semidefinite program (SDP) relaxation of a QCQP is tight. We begin by outlining a general framework for proving such … Read more
We decompose the copositive cone $\copos{n}$ into a disjoint union of a finite number of open subsets $S_{\cal E}$ of algebraic sets $Z_{\cal E}$. Each set $S_{\cal E}$ consists of interiors of faces of $\copos{n}$. On each irreducible component of $Z_{\cal E}$ these faces generically have the same dimension. Each algebraic set $Z_{\cal E}$ is … Read more
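For context, the copositive cone referenced above has the standard definition (stated here for the reader's convenience, using the abstract's notation $\copos{n}$ and writing $\mathcal{S}^n$ for the symmetric $n \times n$ matrices):

$$\copos{n} = \left\{ A \in \mathcal{S}^n : x^\top A x \ge 0 \ \text{ for all } x \in \mathbb{R}^n,\ x \ge 0 \right\}.$$

Membership is only required on the nonnegative orthant, so $\copos{n}$ strictly contains the PSD cone for $n \ge 5$.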
The problem of finding global minima of nonlinear discrete functions arises in many practical fields. In recent years, methods based on discrete filled functions have become popular for solving this sort of problem. However, they rely on the steepest descent method for local searches. Here we present an approach that does not … Read more
A major difficulty in optimization with nonconvex constraints is finding feasible solutions. As simple examples show, the alphaBB algorithm for single-objective optimization may fail to compute feasible solutions, even though it is a popular method in global optimization. In this work, we introduce a filtering approach motivated by a multiobjective reformulation of the constrained … Read more
An important aspect of optimization algorithms, for instance evolutionary algorithms, is the choice of termination criteria that measure the proximity of the found solution to the optimal solution set. A frequently used approach is the numerical verification of necessary optimality conditions such as the Karush-Kuhn-Tucker (KKT) conditions. In this paper, we present a proximity measure which characterizes the … Read more
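The idea of a KKT-based termination criterion can be sketched as follows (a minimal illustrative residual for min f(x) s.t. g_i(x) ≤ 0; the function `kkt_residual` is hypothetical and is not the proximity measure proposed in the paper):

```python
import numpy as np

def kkt_residual(grad_f, grads_g, g_vals, lam):
    """Sum of stationarity, complementarity, and feasibility
    violations at a candidate point; zero exactly at a KKT point.
    (Illustrative sketch only.)"""
    stationarity = grad_f + sum(l * g for l, g in zip(lam, grads_g))
    complementarity = [l * gv for l, gv in zip(lam, g_vals)]
    feasibility = [max(gv, 0.0) for gv in g_vals]
    return (np.linalg.norm(stationarity)
            + np.linalg.norm(complementarity)
            + np.linalg.norm(feasibility))

# min (x-2)^2  s.t.  x - 1 <= 0; optimum x* = 1 with multiplier 2
x = 1.0
residual = kkt_residual(np.array([2 * (x - 2)]),  # grad f = 2(x-2)
                        [np.array([1.0])],        # grad g = 1
                        [x - 1.0],                # g(x) = 0 (active)
                        [2.0])                    # lambda = 2
print(residual)  # 0.0 at the KKT point
```

A solver can terminate once such a residual falls below a tolerance; the paper's contribution is a measure that characterizes proximity more faithfully than this naive sum.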
We provide a complete classification of the extreme rays of the $6 \times 6$ copositive cone ${\cal COP}^6$. We proceed via a coarse intermediate classification of the possible minimal zero support sets of an exceptional extremal matrix $A \in {\cal COP}^6$. For each such minimal zero support set we construct a stratified semi-algebraic manifold in … Read more
Quadratically constrained quadratic programs (QCQPs) are a fundamental class of optimization problems well-known to be NP-hard in general. In this paper we study conditions under which the standard semidefinite program (SDP) relaxation of a QCQP is tight. We begin by outlining a general framework for proving such sufficient conditions. Then using this framework, we show … Read more
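For reference, the standard (Shor) SDP relaxation of a QCQP $\min_x \{x^\top Q_0 x + 2 b_0^\top x + c_0 : x^\top Q_i x + 2 b_i^\top x + c_i \le 0,\ i = 1, \dots, m\}$ replaces the rank-one matrix $xx^\top$ with a PSD variable $X$:

$$\begin{aligned}
\min_{x,\,X}\ & \langle Q_0, X\rangle + 2 b_0^\top x + c_0 \\
\text{s.t.}\ & \langle Q_i, X\rangle + 2 b_i^\top x + c_i \le 0, \quad i = 1, \dots, m, \\
& X \succeq x x^\top.
\end{aligned}$$

Tightness means the optimal value of this relaxation equals that of the original QCQP.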
A popular belief for explaining the efficiency of training deep neural networks is that over-parameterized neural networks have a nice loss landscape. However, it remains unclear whether over-parameterized neural networks contain spurious local minima in general, since all current positive results cannot prove the non-existence of bad local minima, and all current negative results have strong restrictions … Read more