A Binarisation Heuristic for Non-Convex Quadratic Programming with Box Constraints

Non-convex quadratic programming with box constraints is a fundamental problem in the global optimization literature, being one of the simplest NP-hard nonlinear programs. We present a new heuristic for this problem, which enables one to obtain solutions of excellent quality in reasonable computing times. The heuristic consists of four phases: binarisation, convexification, branch-and-bound, and local … Read more
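
For context, the box-constrained quadratic program referred to here is usually stated (in notation of our own choosing, not quoted from the abstract) as \[ \min_{x\in[0,1]^{n}}\ \tfrac{1}{2}\,x^{\top}Qx + c^{\top}x, \] where $Q$ is symmetric but not necessarily positive semidefinite; the indefiniteness of $Q$ is precisely what makes the problem non-convex and NP-hard.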

First-Order Algorithms for Convex Optimization with Nonseparable Objective and Coupled Constraints

In this paper we consider a block-structured convex optimization model in which the block variables are nonseparable in the objective and are further linearly coupled in the constraints. For the 2-block case, we propose a number of first-order algorithms to solve this model. First, the alternating direction method of multipliers (ADMM) is extended, assuming that it … Read more
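
As a point of reference (our own notation, not taken from the paper), the 2-block model in question has the generic form \[ \min_{x,y}\ f(x,y)\quad\text{s.t.}\quad Ax+By=b,\ x\in\mathcal{X},\ y\in\mathcal{Y}, \] and the classical ADMM, designed for the separable case $f(x,y)=f_{1}(x)+f_{2}(y)$, alternates \[ x^{k+1}\in\arg\min_{x\in\mathcal{X}}L_{\rho}(x,y^{k},\lambda^{k}),\qquad y^{k+1}\in\arg\min_{y\in\mathcal{Y}}L_{\rho}(x^{k+1},y,\lambda^{k}),\qquad \lambda^{k+1}=\lambda^{k}-\rho\,(Ax^{k+1}+By^{k+1}-b), \] where $L_{\rho}$ denotes the augmented Lagrangian; the question studied in the paper is how far such updates can be extended when $f$ is nonseparable.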

An Asymptotic Viscosity Selection Result for the Regularized Newton Dynamic

Let $\Phi:\mathcal{H}\longrightarrow\mathbb{R}\cup\left\{ +\infty\right\}$ be a closed convex proper function on a real Hilbert space $\mathcal{H}$, and $\partial\Phi:\mathcal{H}\rightrightarrows\mathcal{H}$ its subdifferential. For any control function $\epsilon:\mathbb{R}_{+}\longrightarrow\mathbb{R}_{+}$ which tends to zero as $t$ goes to $+\infty$, and $\lambda$ a positive parameter, we study the asymptotic behavior of the trajectories of the regularized Newton dynamical system \begin{eqnarray*} & … Read more
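
For background (our addition, not part of the abstract), recall that $\partial\Phi$ is maximally monotone and that, for $\lambda>0$, its resolvent coincides with the proximal mapping of $\Phi$: \[ (I+\lambda\,\partial\Phi)^{-1}(x)=\operatorname{prox}_{\lambda\Phi}(x)=\arg\min_{y\in\mathcal{H}}\Bigl\{\Phi(y)+\tfrac{1}{2\lambda}\|y-x\|^{2}\Bigr\}, \] the object through which Levenberg-Marquardt-type regularizations of Newton dynamics are typically formulated.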

A second-order sequential optimality condition associated to the convergence of optimization algorithms

Sequential optimality conditions have recently played an important role in the analysis of the global convergence of optimization algorithms towards first-order stationary points and in justifying their stopping criteria. In this paper we introduce the first sequential optimality condition that takes into account second-order information. We also present a companion constraint qualification that is less stringent … Read more
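
To fix ideas (this is the standard first-order condition from the literature, not a statement taken from this paper), the best-known sequential optimality condition is AKKT: a feasible point $x^{*}$ of $\min f(x)$ s.t. $g(x)\leq 0$, $h(x)=0$ satisfies AKKT if there exist sequences $x^{k}\rightarrow x^{*}$, $\lambda^{k}\geq 0$ and $\mu^{k}$ with \[ \Bigl\|\nabla f(x^{k})+\sum_{i}\lambda_{i}^{k}\nabla g_{i}(x^{k})+\sum_{j}\mu_{j}^{k}\nabla h_{j}(x^{k})\Bigr\|\rightarrow 0 \quad\text{and}\quad \min\{-g_{i}(x^{k}),\lambda_{i}^{k}\}\rightarrow 0\ \text{for all } i. \] The condition introduced in the paper augments requirements of this type with second-order information.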

Strong SOCP Relaxations for the Optimal Power Flow Problem

This paper proposes three strong second-order cone programming (SOCP) relaxations for the AC optimal power flow (OPF) problem. These three relaxations are incomparable to each other and two of them are incomparable to the standard SDP relaxation of OPF. Extensive computational experiments show that these relaxations have numerous advantages over existing convex relaxations in … Read more
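
For orientation (a standard construction from the OPF literature, not necessarily one of the relaxations proposed here), the classical SOCP relaxation of Jabr introduces variables $c_{ii}=|V_{i}|^{2}$, $c_{ij}=\operatorname{Re}(V_{i}\bar{V}_{j})$ and $s_{ij}=\operatorname{Im}(V_{i}\bar{V}_{j})$ and replaces the nonconvex coupling $c_{ij}^{2}+s_{ij}^{2}=c_{ii}c_{jj}$ with the rotated second-order cone constraint \[ c_{ij}^{2}+s_{ij}^{2}\leq c_{ii}\,c_{jj},\qquad c_{ii}\geq 0; \] the relaxations proposed in the paper are strengthened convex relaxations in this vein.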

A proximal gradient method for ensemble density functional theory

Ensemble density functional theory is valuable for simulations of metallic systems due to the absence of a gap in the spectrum of the Hamiltonian matrices. Although the widely used self-consistent field iteration method can be extended to minimize the total energy functional subject to orthogonality constraints, there is no theoretical … Read more
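
For reference (the generic form of the iteration, not the specific scheme of the paper), a proximal gradient step for minimizing a sum $E=E_{s}+E_{ns}$ with $E_{s}$ smooth takes \[ X^{k+1}=\operatorname{prox}_{t\,E_{ns}}\bigl(X^{k}-t\,\nabla E_{s}(X^{k})\bigr),\qquad \operatorname{prox}_{t\,E_{ns}}(Z)=\arg\min_{X}\Bigl\{E_{ns}(X)+\tfrac{1}{2t}\|X-Z\|^{2}\Bigr\}; \] the paper develops an analogue of such a step for the ensemble DFT total energy minimization under orthogonality constraints.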

Bridging the Gap Between Multigrid, Hierarchical, and Receding-Horizon Control

We analyze the structure of the Euler-Lagrange conditions of a lifted long-horizon optimal control problem. The analysis reveals that the conditions can be solved by using block Gauss-Seidel schemes and we prove that such schemes can be implemented by solving sequences of short-horizon problems. The analysis also reveals that a receding-horizon control scheme is equivalent … Read more
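
For concreteness (generic notation of our own; the paper's exact formulation may differ), a long-horizon optimal control problem of the type analyzed here can be written as \[ \min_{\{x_{t}\},\{u_{t}\}}\ \sum_{t=0}^{N-1}\ell(x_{t},u_{t})\quad\text{s.t.}\quad x_{t+1}=f(x_{t},u_{t}),\ x_{0}\ \text{given}, \] with $N$ large; a receding-horizon scheme repeatedly solves a short-horizon version of this problem, applies the first control, and shifts the horizon forward.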

Stochastic Optimization using a Trust-Region Method and Random Models

In this paper, we propose and analyze a trust-region model-based algorithm for solving unconstrained stochastic optimization problems. Our framework utilizes random models of an objective function $f(x)$, obtained from stochastic observations of the function or its gradient. Our method also utilizes estimates of function values to gauge progress that is being made. The convergence analysis … Read more
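
The following is a minimal, self-contained sketch in Python (with a toy objective, averaged evaluations, and finite-difference linear models of our own choosing, not the algorithm of the paper) of the basic mechanism the abstract describes: random models built from noisy observations drive the step, and function-value estimates drive the acceptance test and the trust-region radius update.

import numpy as np

rng = np.random.default_rng(0)

def noisy_f(x, sigma=1e-3):
    # Stochastic oracle: toy smooth objective plus zero-mean Gaussian noise.
    return 0.5 * float(np.dot(x, x)) + float(np.sum(np.sin(x))) + sigma * rng.standard_normal()

def estimate(x, n=20):
    # Function-value estimate obtained by averaging n noisy observations.
    return np.mean([noisy_f(x) for _ in range(n)])

def random_linear_model(x, h):
    # Gradient of a random linear model, built by forward differences of noisy estimates.
    f0 = estimate(x)
    g = np.zeros_like(x)
    for i in range(x.size):
        e = np.zeros_like(x)
        e[i] = h
        g[i] = (estimate(x + e) - f0) / h
    return g

def trust_region_random_models(x0, delta=1.0, eta=0.1, gamma=2.0, iters=40):
    x = np.array(x0, dtype=float)
    for _ in range(iters):
        g = random_linear_model(x, h=min(delta, 1e-2))
        gnorm = np.linalg.norm(g)
        if gnorm < 1e-8:
            break
        s = -(delta / gnorm) * g                  # Cauchy step for the linear model
        predicted = delta * gnorm                 # model-predicted decrease
        actual = estimate(x) - estimate(x + s)    # estimated actual decrease
        if actual >= eta * predicted:             # sufficient-decrease test on estimates
            x, delta = x + s, gamma * delta       # accept the step, expand the radius
        else:
            delta /= gamma                        # reject the step, shrink the radius
    return x

print(trust_region_random_models([2.0, -3.0]))

Averaged evaluations and finite-difference models are stand-ins here for whatever stochastic estimates and random models the paper's framework actually admits.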

Lower Bounds on Complexity of Lyapunov Functions for Switched Linear Systems

We show that for any positive integer $d$, there are families of switched linear systems—in fixed dimension and defined by two matrices only—that are stable under arbitrary switching but do not admit (i) a polynomial Lyapunov function of degree $\leq d$, or (ii) a polytopic Lyapunov function with $\leq d$ facets, or (iii) a piecewise … Read more
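
As a point of comparison (a standard certificate, not a result of the paper), the simplest Lyapunov function one can search for in this setting is a common quadratic one: for a discrete-time switched system $x_{k+1}=A_{\sigma(k)}x_{k}$ with $\sigma(k)\in\{1,2\}$, stability under arbitrary switching is certified by any $P\succ 0$ satisfying \[ A_{i}^{\top}P\,A_{i}-P\prec 0,\qquad i=1,2 \] (the continuous-time analogue replaces this with $A_{i}^{\top}P+PA_{i}\prec 0$); the families constructed in the paper are stable under arbitrary switching yet admit no certificate of any of the bounded-complexity forms listed above.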

New results on subgradient methods for strongly convex optimization problems with a unified analysis

We develop subgradient- and gradient-based methods for minimizing strongly convex functions under a notion which generalizes the standard Euclidean strong convexity. We propose a unifying framework for subgradient methods that yields two kinds of methods, namely the Proximal Gradient Method (PGM) and the Conditional Gradient Method (CGM), and that subsumes several existing methods. The unifying framework provides … Read more
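
For reference (the classical Euclidean special case, not the generalized notion of strong convexity treated in the paper), the projected subgradient method for a $\mu$-strongly convex $f$ over a closed convex set $C$ iterates \[ x_{k+1}=\Pi_{C}\bigl(x_{k}-\alpha_{k}g_{k}\bigr),\qquad g_{k}\in\partial f(x_{k}),\qquad \alpha_{k}=\frac{2}{\mu\,(k+1)}, \] and with suitable averaging of the iterates attains an $O(1/k)$ bound on the objective gap; the paper develops methods of this type, together with conditional-gradient counterparts, under a generalized strong convexity assumption.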