On the complexity of optimization over the standard simplex

We review complexity results for minimizing polynomials over the standard simplex and unit hypercube. In addition, we show that there exists a polynomial time approximation scheme (PTAS) for minimizing some classes of functions (including Lipschitz continuous functions) over the standard simplex. The main tools used in the analysis are Bernstein approximation and Lagrange interpolation on …
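Since the truncated abstract names Bernstein approximation as one of the main tools, here is a minimal numerical sketch of the degree-$k$ Bernstein operator on the standard simplex; the test function, the choice $k=10$, and the helper name `bernstein_simplex` are illustrative assumptions, not taken from the paper.

```python
import math
from itertools import product

import numpy as np


def bernstein_simplex(f, n, k):
    """Degree-k Bernstein approximation of f on the standard simplex in R^n:
    B_k(f)(x) = sum_{|alpha|=k} f(alpha/k) * (k! / prod_i alpha_i!) * prod_i x_i^alpha_i."""
    # Enumerate all alpha in Z_{>=0}^n with |alpha| = k (fine for small n and k).
    alphas = [a for a in product(range(k + 1), repeat=n) if sum(a) == k]

    def B(x):
        x = np.asarray(x, dtype=float)
        total = 0.0
        for a in alphas:
            coef = math.factorial(k) / np.prod([math.factorial(ai) for ai in a])
            total += f(np.array(a) / k) * coef * np.prod(x ** np.array(a))
        return total

    return B


# Approximate a Lipschitz function on the simplex {x >= 0, x_1 + x_2 + x_3 = 1}.
f = lambda x: abs(x[0] - x[1]) + x[2] ** 2
B10 = bernstein_simplex(f, n=3, k=10)
x = np.array([0.2, 0.3, 0.5])
print(f(x), B10(x))  # the two values should be close for moderate k
```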

A sum of squares approximation of nonnegative polynomials

We show that every real nonnegative polynomial $f$ can be approximated as closely as desired (in the $l_1$-norm of its coefficient vector) by a sequence of polynomials $\{f_\epsilon\}$ that are sums of squares. The novelty is that each $f_\epsilon$ has a simple and explicit form in terms of $f$ and $\epsilon$. Citation: SIAM J. Optimization …
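For orientation, a hedged sketch of the kind of explicit form involved (this is the construction usually associated with this result; the truncated abstract itself does not restate it): a small high-order perturbation is added in each variable,

$$
f_\epsilon(x) \;=\; f(x) \;+\; \epsilon \sum_{k=0}^{r(\epsilon)} \sum_{i=1}^{n} \frac{x_i^{2k}}{k!},
\qquad
\|f_\epsilon - f\|_1 \;\le\; \epsilon\, n \sum_{k=0}^{r(\epsilon)} \frac{1}{k!} \;\le\; \epsilon\, n\, e,
$$

so the coefficient-wise $l_1$ distance to $f$ shrinks linearly with $\epsilon$.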

SOS approximation of polynomials nonnegative on a real algebraic set

Let $V\subset R^n$ be a real algebraic set described by finitely many polynomial equations $g_j(x)=0,\,j\in J$, and let $f$ be a real polynomial, nonnegative on $V$. We show that for every $\epsilon>0$, there exist nonnegative scalars $\{\lambda_j\}_{j\in J}$ such that, for all $r$ sufficiently large, $f+\epsilon\theta_r+\sum_{j\in J} \lambda_j g_j^2$ is a sum of squares. Here, …

New results for molecular formation under pairwise potential minimization

We establish new lower bounds on the distance between two points of a minimum energy configuration of $N$ points in $\mathbb{R}^3$ interacting according to a pairwise potential function. For the Lennard-Jones case, this bound is 0.67985 (and 0.7633 in the planar case). A similar argument yields an estimate for the minimal distance in Morse clusters, …
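To make the quantity being bounded concrete, here is a minimal numerical sketch (not the paper's argument): locally minimize the total pairwise Lennard-Jones energy of a small cluster and report the smallest interpoint distance. The reduced pair potential $v(r)=r^{-12}-2r^{-6}$ (equilibrium distance 1), the jittered-grid start, and the use of SciPy's L-BFGS-B are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.spatial.distance import pdist


def lj_energy(flat_coords, n):
    """Total pairwise Lennard-Jones energy in reduced form v(r) = r^-12 - 2 r^-6."""
    r = pdist(flat_coords.reshape(n, 3))
    return np.sum(r ** -12 - 2.0 * r ** -6)


# Start from a jittered 2x2x2 grid so no two points nearly coincide.
rng = np.random.default_rng(0)
n = 8
grid = np.array([[i, j, k] for i in range(2) for j in range(2) for k in range(2)], dtype=float)
x0 = (1.1 * grid + 0.05 * rng.standard_normal(grid.shape)).ravel()

# Local minimization only; certifying a global minimum-energy configuration is much harder.
res = minimize(lj_energy, x0, args=(n,), method="L-BFGS-B")
d_min = pdist(res.x.reshape(n, 3)).min()
print(f"energy = {res.fun:.4f}, smallest pair distance = {d_min:.4f}")
```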

Transposition theorems and qualification-free optimality conditions

New theorems of the alternative for polynomial constraints (based on the Positivstellensatz from real algebraic geometry) and for linear constraints (generalizing the transposition theorems of Motzkin and Tucker) are proved. Based on these, two Karush-John optimality conditions — holding without any constraint qualification — are proved for single- or multi-objective constrained optimization problems. The first …
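For readers who want the classical statement being generalized, one standard form of Motzkin's transposition theorem (recalled here for context; the paper's generalized versions sit behind the truncation) is: for matrices $A$, $B$, $C$ with $A$ nonvacuous, exactly one of the following two systems has a solution,

$$
\text{(I)}\;\; Ax > 0,\;\; Bx \ge 0,\;\; Cx = 0
\qquad\text{or}\qquad
\text{(II)}\;\; A^{\mathsf T}u + B^{\mathsf T}v + C^{\mathsf T}w = 0,\;\; u \ge 0,\; u \ne 0,\;\; v \ge 0.
$$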

A linear programming reformulation of the standard quadratic optimization problem

The problem of minimizing a quadratic form over the standard simplex is known as the standard quadratic optimization problem (SQO). It is NP-hard and contains the maximum stable set problem in graphs as a special case. In this note we show that the SQO problem may be reformulated as an (exponentially sized) linear program. Citation: …
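To make the stable-set connection explicit, recall the Motzkin–Straus-type identity usually cited for that claim: with $\Delta$ the standard simplex, $A$ the adjacency matrix of a graph $G$, and $\alpha(G)$ its stability number,

$$
p^\ast \;=\; \min\{\, x^{\mathsf T} Q x \;:\; e^{\mathsf T} x = 1,\; x \ge 0 \,\},
\qquad
\min_{x \in \Delta}\, x^{\mathsf T}(A + I)\, x \;=\; \frac{1}{\alpha(G)},
$$

so an SQO solver applied with $Q = A + I$ would recover $\alpha(G)$, which is how SQO inherits NP-hardness.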

A PTAS for the minimization of polynomials of fixed degree over the simplex

We consider the problem of computing the minimum value $p_{\min}$ taken by a polynomial $p(x)$ of degree $d$ over the standard simplex $\Delta$. This is an NP-hard problem already for degree $d=2$. For any integer $k\ge 1$, by minimizing $p(x)$ over the set of rational points in $\Delta$ with denominator $k$, one obtains a hierarchy …
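A minimal sketch of the grid bound the abstract describes: enumerate the rational points of $\Delta$ with denominator $k$ and take the smallest value of $p$ there, which is an upper bound on $p_{\min}$ since the grid is a subset of $\Delta$. The example polynomial and the helper names are illustrative assumptions.

```python
from itertools import product

import numpy as np


def simplex_grid(n, k):
    """Rational points of the standard simplex with denominator k:
    all x = alpha / k with alpha in Z_{>=0}^n and sum(alpha) = k."""
    for alpha in product(range(k + 1), repeat=n):
        if sum(alpha) == k:
            yield np.array(alpha) / k


def grid_bound(p, n, k):
    """Minimum of p over the denominator-k grid: an upper bound on p_min over Delta."""
    return min(p(x) for x in simplex_grid(n, k))


# Illustrative degree-2 polynomial; its minimum over the simplex is 0,
# attained at x = (1/pi, 1 - 1/pi, 0), which lies on no finite grid,
# so the bounds strictly improve as k grows.
p = lambda x: (x[0] - 1 / np.pi) ** 2 + x[1] * x[2]
for k in (1, 2, 4, 8, 16):
    print(k, grid_bound(p, n=3, k=k))
```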

Jordan-algebraic aspects of nonconvex optimization over symmetric cones

We illustrate the usefulness of Jordan-algebraic techniques for nonconvex optimization by considering a potential-reduction algorithm for a nonconvex quadratic function over the domain obtained as the intersection of a symmetric cone with an affine subspace. Citation: Preprint, September 2004.
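In symbols, the problem class described above is (the specific notation being an assumption here)

$$
\min\ \{\, x^{\mathsf T} Q x + c^{\mathsf T} x \;:\; Ax = b,\ x \in K \,\},
$$

where $K$ is a symmetric cone (for example the nonnegative orthant, the second-order cone, or the cone of positive semidefinite matrices) and $Q$ is not assumed positive semidefinite, so the objective may be nonconvex.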

Constrained Global Optimization with Radial Basis Functions

Response surface methods show promising results for global optimization of costly nonconvex objective functions, i.e., the problem of finding the global minimum when there are several local minima and each function value takes considerable CPU time to compute. Such problems often arise in industrial and financial applications, where a function value could be a …
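As a minimal sketch of the response-surface idea (a generic cubic RBF interpolant with a linear tail; the kernel choice, sample design, and helper names are illustrative assumptions, not the paper's specific method): fit a cheap surrogate to the expensive function values and query the surrogate instead of the true function.

```python
import numpy as np


def fit_cubic_rbf(X, f_vals):
    """Fit s(x) = sum_i lam_i ||x - x_i||^3 + b^T x + a to sampled function values."""
    m, d = X.shape
    r = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=2)
    Phi = r ** 3
    P = np.hstack([X, np.ones((m, 1))])  # linear + constant polynomial tail
    A = np.block([[Phi, P], [P.T, np.zeros((d + 1, d + 1))]])
    rhs = np.concatenate([f_vals, np.zeros(d + 1)])
    coef = np.linalg.solve(A, rhs)
    lam, poly = coef[:m], coef[m:]

    def s(x):
        x = np.asarray(x, dtype=float)
        rx = np.linalg.norm(X - x, axis=1)
        return rx ** 3 @ lam + poly[:d] @ x + poly[d]

    return s


# Cheap surrogate for an "expensive" function (here just a stand-in).
expensive_f = lambda x: np.sin(3 * x[0]) + (x[1] - 0.5) ** 2
rng = np.random.default_rng(1)
X = rng.uniform(0, 1, size=(20, 2))
s = fit_cubic_rbf(X, np.array([expensive_f(x) for x in X]))
print(expensive_f([0.3, 0.7]), s([0.3, 0.7]))  # true value vs. surrogate prediction
```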

Convergence Analysis of the DIRECT Algorithm

The DIRECT algorithm is a deterministic sampling method for bound-constrained Lipschitz continuous optimization. We prove a subsequential convergence result for the DIRECT algorithm that quantifies some of the convergence observations in the literature. Our results apply to several variations on the original method, including one that will handle general constraints. We use techniques from …
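For readers who want to experiment with the method being analyzed, recent SciPy releases ship a DIRECT implementation; the snippet below is a generic usage sketch (the test function is an arbitrary illustrative choice, and `scipy.optimize.direct` requires SciPy 1.9 or later).

```python
import numpy as np
from scipy.optimize import Bounds, direct  # scipy.optimize.direct needs SciPy >= 1.9


# A bound-constrained, Lipschitz continuous test function (illustrative choice).
def f(x):
    return np.sin(3 * x[0]) * np.cos(2 * x[1]) + 0.1 * np.sum(x ** 2)


# DIRECT samples the box deterministically; no starting point or gradient is needed.
res = direct(f, Bounds([-2.0, -2.0], [2.0, 2.0]), maxfun=2000)
print(res.x, res.fun)
```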