A novel algorithm for a broad class of nonconvex optimization problems

In this paper, we propose a new global optimization approach for solving nonconvex optimization problems in which the nonconvex components are sums of products of convex functions. A broad class of nonconvex problems can be written in this way, such as concave minimization problems, difference-of-convex problems, and fractional optimization problems. Our approach exploits … Read more
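
As a hedged illustration of the problem class named above (the exact formulation and assumptions are in the full paper), one common way to write such problems, and how a difference-of-convex term fits, is:

```latex
% Assumed template for the problem class (illustrative notation):
\begin{align*}
\min_{x \in C}\quad & f_0(x) \;+\; \sum_{i=1}^{m} \prod_{j=1}^{k_i} f_{ij}(x),
\qquad f_0,\ f_{ij} \ \text{convex on the convex set } C.
\end{align*}
% Example: a difference-of-convex objective f(x) - g(x) can be cast in this form by
% taking one product with the two convex factors g(x) and the constant function -1,
% provided constant factors are admissible under the paper's precise assumptions.
```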

Asynchronous Iterations in Optimization: New Sequence Results and Sharper Algorithmic Guarantees

We introduce novel convergence results for asynchronous iterations that appear in the analysis of parallel and distributed optimization algorithms. The results are simple to apply and give explicit estimates for how the degree of asynchrony impacts the convergence rates of the iterates. Our results shorten, streamline and strengthen existing convergence proofs for several asynchronous optimization … Read more
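
For intuition on how a delay bound enters such analyses, here is a minimal, illustrative delayed-gradient iteration in Python (not one of the algorithms treated in the paper); `tau` is an assumed bound on how stale the gradient information may be, which is one common way to quantify the degree of asynchrony:

```python
import numpy as np

def delayed_gradient_descent(grad, x0, step, tau, iters, rng):
    """Run x_{k+1} = x_k - step * grad(x_{d(k)}) with a random delay k - d(k) <= tau."""
    history = [np.array(x0, dtype=float)]
    x = history[0].copy()
    for k in range(iters):
        d = max(0, k - rng.integers(0, tau + 1))   # stale iterate index, at most tau old
        x = x - step * grad(history[d])
        history.append(x.copy())
    return x

if __name__ == "__main__":
    A = np.diag([1.0, 10.0])                       # simple strongly convex quadratic
    grad = lambda z: A @ z
    rng = np.random.default_rng(0)
    for tau in (0, 5, 20):                         # larger tau = more asynchrony
        x = delayed_gradient_descent(grad, [1.0, 1.0], step=0.05, tau=tau, iters=200, rng=rng)
        print(tau, np.linalg.norm(x))
```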

A Moment-SOS Hierarchy for Robust Polynomial Matrix Inequality Optimization with SOS-Convexity

We study a class of polynomial optimization problems with a robust polynomial matrix inequality constraint whose uncertainty set is itself defined by a polynomial matrix inequality (including robust polynomial semidefinite programs as a special case). Under certain SOS-convexity assumptions, we construct a hierarchy of moment-SOS relaxations for this problem to obtain convergent upper … Read more
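
A hedged sketch of what a robust polynomial matrix inequality (PMI) problem of this type looks like, in illustrative notation (the paper's exact formulation may differ):

```latex
% Assumed shape of the robust PMI problem (illustrative notation):
\begin{align*}
\min_{x \in \mathbb{R}^n}\quad & f(x) \\
\text{s.t.}\quad & P(x, y) \succeq 0 \quad \text{for all } y \in \mathcal{U}, \\
& \mathcal{U} = \{\, y \in \mathbb{R}^p : Q(y) \succeq 0 \,\},
\end{align*}
% where f is a polynomial and P, Q are symmetric-matrix-valued polynomial maps,
% so both the constraint and the uncertainty set are given by PMIs.
```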

Optimal Low-Rank Matrix Completion: Semidefinite Relaxations and Eigenvector Disjunctions

Low-rank matrix completion consists of computing a matrix of minimal complexity that recovers a given set of observations as accurately as possible. Unfortunately, existing methods for matrix completion are heuristics that, while highly scalable and often identifying high-quality solutions, do not possess any optimality guarantees. We reexamine matrix completion with an optimality-oriented eye. We reformulate … Read more
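
For context, here is a minimal sketch of one of the scalable heuristics the abstract contrasts with (a soft-impute style singular-value-thresholding loop); this is not the semidefinite-relaxation or eigenvector-disjunction method proposed in the paper, and the function name and parameters below are illustrative:

```python
import numpy as np

def soft_impute(M_obs, mask, lam=1.0, iters=200):
    """Iteratively fill missing entries, then soft-threshold the singular values."""
    X = np.where(mask, M_obs, 0.0)
    for _ in range(iters):
        U, s, Vt = np.linalg.svd(X, full_matrices=False)
        X_low = U @ np.diag(np.maximum(s - lam, 0.0)) @ Vt   # singular-value shrinkage
        X = np.where(mask, M_obs, X_low)                     # keep observed entries fixed
    return X_low

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    truth = rng.standard_normal((30, 2)) @ rng.standard_normal((2, 30))  # rank-2 ground truth
    mask = rng.random(truth.shape) < 0.5                                  # observe ~50% of entries
    est = soft_impute(truth, mask, lam=0.5)
    print(np.linalg.norm((est - truth)[~mask]) / np.linalg.norm(truth[~mask]))
```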

Heuristic methods for noisy derivative-free bound-constrained mixed-integer optimization

This paper discusses MATRS, a new matrix adaptation trust region strategy for solving noisy derivative-free mixed-integer optimization problems with simple bounds. MATRS repeatedly cycles through five phases (mutation, selection, recombination, trust-region, and mixed-integer) in this order. However, if in the mutation phase a new best point (the point with the lowest inexact function value among all evaluated … Read more
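
A purely schematic Python sketch of the five-phase cycle as described above, with hypothetical placeholder phases; it is not the MATRS implementation and omits the special handling triggered when the mutation phase finds a new best point:

```python
# Placeholder phases (identity stubs), not the actual MATRS procedures.
def mutation(state):      return state
def selection(state):     return state
def recombination(state): return state
def trust_region(state):  return state
def mixed_integer(state): return state

def matrs_style_cycle(state, n_cycles=3):
    # The abstract states that the five phases are executed repeatedly in this
    # fixed order; the branch taken when mutation finds a new best point is omitted.
    for _ in range(n_cycles):
        for phase in (mutation, selection, recombination, trust_region, mixed_integer):
            state = phase(state)
    return state

if __name__ == "__main__":
    print(matrs_style_cycle({"best_point": None, "best_value": float("inf")}))
```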

Polyhedral Properties of RLT Relaxations of Nonconvex Quadratic Programs and Their Implications on Exact Relaxations

We study linear programming relaxations of nonconvex quadratic programs given by the reformulation-linearization technique (RLT), referred to as RLT relaxations. We investigate the relations between the polyhedral properties of the feasible regions of a quadratic program and its RLT relaxation. We establish various connections between recession directions, boundedness, and vertices of the two feasible regions. … Read more
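
To make the construction concrete, here is the standard RLT relaxation for the special case of box constraints (illustrative; the paper treats general polyhedral feasible regions):

```latex
% Illustrative RLT relaxation of a box-constrained quadratic program:
\begin{align*}
\text{QP:}\quad & \min_{x}\ \tfrac12 x^\top Q x + c^\top x
  \quad \text{s.t.}\quad \ell \le x \le u,\\[4pt]
\text{RLT:}\quad & \min_{x,\,X}\ \tfrac12 \langle Q, X\rangle + c^\top x
  \quad \text{s.t.}\quad \ell \le x \le u,\\
& X_{ij} - \ell_j x_i - u_i x_j + \ell_j u_i \le 0, \qquad
  X_{ij} - \ell_i x_j - u_j x_i + \ell_i u_j \le 0,\\
& X_{ij} - \ell_i x_j - \ell_j x_i + \ell_i \ell_j \ge 0, \qquad
  X_{ij} - u_i x_j - u_j x_i + u_i u_j \ge 0,
\end{align*}
% where X linearizes xx^\top: each constraint comes from multiplying two bound
% factors, e.g. (x_i - \ell_i)(u_j - x_j) \ge 0, and replacing x_i x_j by X_{ij},
% which yields a linear program in (x, X).
```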

Convergence Analysis on a Data-Driven Inexact Proximal-Indefinite Stochastic ADMM

In this paper, we propose an Inexact Proximal-indefinite Stochastic ADMM (abbreviated as IPS-ADMM) to solve a class of separable convex optimization problems whose objective functions consist of two parts: one is an average of many smooth convex functions and the other is a convex but potentially nonsmooth function. The involved smooth subproblem is tackled by … Read more
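
For orientation, the classical two-block ADMM on the problem template described above is sketched below; IPS-ADMM modifies this scaffold with inexact, indefinite-proximal, and stochastic ingredients, whose exact form is given in the paper:

```latex
% Classical two-block ADMM on the separable template (illustrative notation):
\begin{align*}
\min_{x, z}\quad & \frac{1}{n}\sum_{i=1}^{n} f_i(x) + g(z)
  \quad \text{s.t.}\quad Ax + Bz = b,\\[4pt]
x^{k+1} &= \arg\min_x\ \frac{1}{n}\sum_{i=1}^{n} f_i(x)
  + \frac{\beta}{2}\big\|Ax + Bz^{k} - b + \lambda^{k}/\beta\big\|^2,\\
z^{k+1} &= \arg\min_z\ g(z)
  + \frac{\beta}{2}\big\|Ax^{k+1} + Bz - b + \lambda^{k}/\beta\big\|^2,\\
\lambda^{k+1} &= \lambda^{k} + \beta\,\big(Ax^{k+1} + Bz^{k+1} - b\big).
\end{align*}
```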

(ε-)Efficiency in Fractional Vector Optimization

The issue of completely characterizing efficient (Pareto) solutions to a fractional vector (multiobjective or multicriteria) minimization problem in which the involved functions are convex has not been addressed previously. Thanks to an earlier characterization of weak efficiency in difference vector optimization by El Maghri, we obtain a vectorial necessary and sufficient condition given in terms of … Read more
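
A hedged reminder of the standard notions involved, in illustrative notation (the paper's precise setting and assumptions may differ):

```latex
% Fractional vector minimization and (epsilon-)efficiency (standard definitions):
\begin{align*}
\min_{x \in C}\quad & \Big( \tfrac{f_1(x)}{g_1(x)},\ \dots,\ \tfrac{f_m(x)}{g_m(x)} \Big),
\qquad g_i(x) > 0 \ \text{on the feasible set } C.
\end{align*}
% A feasible \bar{x} is (Pareto) efficient if there is no x \in C with
% f_i(x)/g_i(x) \le f_i(\bar{x})/g_i(\bar{x}) for all i and strict inequality for some i.
% For \varepsilon-efficiency with \varepsilon \ge 0, the comparison is relaxed
% componentwise: no x \in C satisfies f_i(x)/g_i(x) \le f_i(\bar{x})/g_i(\bar{x}) - \varepsilon_i
% for all i with strict inequality for some i.
```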

Effective matrix adaptation strategy for noisy derivative-free optimization

In this paper, we introduce a new effective matrix adaptation evolution strategy (MADFO) for noisy derivative-free optimization problems. Like every matrix adaptation evolution strategy (MAES) solver, MADFO consists of three phases: mutation, selection, and recombination. MADFO improves the mutation phase by generating good step sizes, neither too small nor too large, that increase the probability of selecting mutation points … Read more
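
To illustrate the general idea of keeping mutation step sizes neither too small nor too large, here is a minimal Python sketch using the classical one-fifth success rule; it is not the MADFO step-size heuristic:

```python
import numpy as np

def one_fifth_rule_search(f, x0, sigma=1.0, iters=300, seed=0):
    """Random mutation search with step-size control by the one-fifth success rule."""
    rng = np.random.default_rng(seed)
    x, fx = np.array(x0, dtype=float), f(x0)
    for _ in range(iters):
        y = x + sigma * rng.standard_normal(x.shape)   # mutation step of size sigma
        fy = f(y)
        if fy < fx:                                    # success: accept and enlarge the step
            x, fx, sigma = y, fy, sigma * 1.5
        else:                                          # failure: shrink the step
            sigma *= 1.5 ** (-0.25)
    return x, fx, sigma

if __name__ == "__main__":
    sphere = lambda z: float(np.sum(np.asarray(z) ** 2))
    x, fx, sigma = one_fifth_rule_search(sphere, x0=np.ones(5))
    print(fx, sigma)
```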

Equivalent Sufficient Conditions for Global Optimality of Quadratically Constrained Quadratic Program

We study the equivalence of several well-known sufficient optimality conditions for a general quadratically constrained quadratic program (QCQP). The conditions fall into two categories: the first is for determining an optimal solution and the second is for finding the optimal value. The first category of conditions includes the existence of a … Read more
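
As one example of a well-known sufficient condition of the first kind, a standard Lagrangian-based global-optimality certificate is sketched below (whether and how it appears among the equivalent conditions studied is detailed in the full paper):

```latex
% A classical global-optimality certificate for QCQP (illustrative notation):
\begin{align*}
&\text{QCQP:}\quad \min_x\ x^\top Q_0 x + 2 q_0^\top x
  \quad \text{s.t.}\quad x^\top Q_i x + 2 q_i^\top x + c_i \le 0,\ \ i = 1,\dots,m.\\[4pt]
&\text{If } \bar{x} \text{ is feasible and there exists } \lambda \ge 0 \text{ such that}\\
&\quad \big(Q_0 + \textstyle\sum_i \lambda_i Q_i\big)\bar{x}
       + q_0 + \textstyle\sum_i \lambda_i q_i = 0,
 \qquad \lambda_i\big(\bar{x}^\top Q_i \bar{x} + 2 q_i^\top \bar{x} + c_i\big) = 0,\\
&\quad Q_0 + \textstyle\sum_i \lambda_i Q_i \succeq 0,\\
&\text{then } \bar{x} \text{ is a global minimizer: } L(\cdot,\lambda) \text{ is convex and}\\
&\text{minimized at } \bar{x}, \text{ and complementary slackness closes the duality gap.}
\end{align*}
```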