A randomized method for smooth convex minimization, motivated by probability maximization

We propose a randomized gradient method, or, from a dual viewpoint, a randomized cutting-plane method. From the primal viewpoint, our method resembles the stochastic approximation family; in contrast to stochastic approximation, however, the present method builds a model problem.
Citation: Kecskemet College, Pallasz Athene University, Izsaki ut 10, 6000 Kecskemet, Hungary; …

Optimal scenario generation and reduction in stochastic programming

Scenarios are indispensable ingredients for the numerical solution of stochastic optimization problems. Earlier approaches to optimal scenario generation and reduction are based on stability arguments involving distances of probability measures. In this paper we review those ideas and suggest making use of stability estimates based on distances containing minimal information, i.e., on data appearing …

An Active-Set Algorithmic Framework for Non-Convex Optimization Problems over the Simplex

In this paper, we describe a new active-set algorithmic framework for minimizing a non-convex function over the unit simplex. At each iteration, the method makes use of a rule for identifying active variables (i.e., variables that are zero at a stationary point) and specific directions (that we name active-set gradient related directions) satisfying a new …
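The framework operates on the unit simplex Δ = {x : x ≥ 0, Σᵢ xᵢ = 1}. As illustrative background only (this is a standard building block, not the paper's active-set rule), here is the classical sort-based Euclidean projection onto Δ, which such methods often rely on:

```python
import numpy as np

def project_simplex(v):
    """Euclidean projection of v onto the unit simplex {x >= 0, sum(x) = 1}.

    Standard sort-based algorithm: sort v in decreasing order, find the
    largest index rho with u_rho - (cumsum(u)_rho - 1)/rho > 0, then shift
    by the corresponding threshold theta and clip at zero.
    """
    u = np.sort(v)[::-1]                      # decreasing order
    css = np.cumsum(u) - 1.0
    k = np.arange(1, len(v) + 1)
    rho = np.max(np.nonzero(u - css / k > 0)[0]) + 1
    theta = css[rho - 1] / rho
    return np.maximum(v - theta, 0.0)
```

The projection is idempotent (projecting a point already on the simplex returns it unchanged), which is a useful sanity check when embedding it in a projected-gradient loop.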

Polynomial-Time Methods to Solve Unimodular Quadratic Programs With Performance Guarantees

We develop polynomial-time heuristic methods to approximately solve unimodular quadratic programs (UQPs), which are known to be NP-hard. In the UQP framework, we maximize a quadratic function of a vector of complex variables with unit modulus. Several problems in active sensing and wireless communication applications boil down to UQP. With this motivation, we present three …
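A UQP maximizes x^H Q x subject to |xᵢ| = 1 for all i. As a minimal sketch (a commonly used power-method-style baseline, not necessarily one of the paper's three methods), one can iterate x ← exp(i·arg(Qx)); after a diagonal shift making Q positive semidefinite, each step does not decrease the objective:

```python
import numpy as np

def uqp_power_heuristic(Q, iters=100, seed=0):
    """Approximate maximizer of x^H Q x over unit-modulus complex x.

    Power-method-like fixed-point iteration: x <- exp(i * angle(Q x)).
    Q is symmetrized and diagonally shifted to be PSD; adding lam*I only
    shifts the objective by a constant on the unit-modulus set, so the
    maximizer is unchanged.
    """
    Q = (Q + Q.conj().T) / 2
    n = Q.shape[0]
    lam = max(0.0, -np.min(np.linalg.eigvalsh(Q)))
    Qs = Q + lam * np.eye(n)
    rng = np.random.default_rng(seed)
    x = np.exp(1j * rng.uniform(0, 2 * np.pi, n))   # random unit-modulus start
    for _ in range(iters):
        x = np.exp(1j * np.angle(Qs @ x))           # project Qx back to unit modulus
    return x
```

For a rank-one Q = a a^H with unit-modulus a, the iteration recovers the global optimum x = a (up to a common phase) with objective value n².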

Direct Search Methods on Reductive Homogeneous Spaces

Direct search methods are mainly designed for use in problems with no equality constraints. However, there are many instances where the feasible set is of measure zero in the ambient space and no mesh point lies within it. There are methods for working with feasible sets that are (Riemannian) manifolds, but not all manifolds are …

Symmetric ADMM with Positive-Indefinite Proximal Regularization for Linearly Constrained Convex Optimization

The proximal ADMM, which adds proximal regularization terms to ADMM's subproblems, is a popular and useful method for linearly constrained separable convex problems, especially in its linearized form. A well-known requirement for guaranteeing convergence of the method in the literature is that the proximal regularization must be positive semidefinite. Recently, it was shown by He et …
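To make the positive-semidefinite requirement concrete, here is a minimal sketch of the classical linearized proximal ADMM (the standard PSD setting the abstract refers to, not the paper's indefinite variant) applied to the Lasso split min μ‖x‖₁ + ½‖z − d‖² s.t. Mx − z = 0. Choosing the proximal matrix P = τI − βMᵀM with τ ≥ β‖M‖² keeps P ⪰ 0 and reduces the x-subproblem to a soft-thresholding step:

```python
import numpy as np

def linearized_admm_lasso(M, d, mu=0.1, beta=1.0, iters=2000):
    """Linearized (proximal) ADMM for  min_x mu*||x||_1 + 0.5*||Mx - d||^2,
    split as  min mu*||x||_1 + 0.5*||z - d||^2  s.t.  Mx - z = 0.

    The x-subproblem adds 0.5*||x - x_k||_P^2 with P = tau*I - beta*M^T M;
    tau >= beta*||M||^2 makes P positive semidefinite, cancels the M^T M
    coupling, and turns the step into a prox of the l1 norm.
    """
    m, n = M.shape
    tau = 1.01 * beta * np.linalg.norm(M, 2) ** 2   # ensures P is PSD
    x, z, u = np.zeros(n), np.zeros(m), np.zeros(m)
    shrink = lambda v, t: np.sign(v) * np.maximum(np.abs(v) - t, 0.0)
    for _ in range(iters):
        # linearized x-update: soft-threshold around a gradient step
        x = shrink(x - (beta / tau) * M.T @ (M @ x - z + u), mu / tau)
        # exact z-update of the quadratic subproblem
        z = (d + beta * (M @ x + u)) / (1.0 + beta)
        # scaled dual ascent on the constraint Mx - z = 0
        u = u + M @ x - z
    return x, z, u
```

The paper's contribution concerns relaxing exactly this PSD condition on P; the sketch above shows why P enters the subproblems in the first place.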

Complex Number Formulation and Convex Relaxations for Aircraft Conflict Resolution

We present a novel complex number formulation along with tight convex relaxations for the aircraft conflict resolution problem. Our approach combines both speed and heading control and provides global optimality guarantees despite non-convexities in the feasible region. As a side result, we present a new characterization of the conflict separation condition in the form of …

A block symmetric Gauss-Seidel decomposition theorem for convex composite quadratic programming and its applications

For a symmetric positive semidefinite linear system of equations $\mathcal{Q} {\bf x} = {\bf b}$, where ${\bf x} = (x_1,\ldots,x_s)$ is partitioned into $s$ blocks, with $s \geq 2$, we show that each cycle of the classical block symmetric Gauss-Seidel (block sGS) method exactly solves the associated quadratic programming (QP) problem augmented with an …
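As a minimal illustration of the object the theorem analyzes (a sketch of the classical iteration, not the decomposition theorem itself), one cycle of block sGS performs a forward block sweep followed by a backward block sweep on $\mathcal{Q}{\bf x} = {\bf b}$:

```python
import numpy as np

def sgs_cycle(Q_blocks, b_parts, x_parts):
    """One symmetric Gauss-Seidel cycle for Q x = b with x split into s blocks.

    Q_blocks[i][j] is block (i, j) of the SPD matrix Q. The forward sweep
    updates blocks 0..s-1 in order; the backward sweep revisits s-2..0,
    each time solving exactly for one block given the current others.
    """
    s = len(x_parts)

    def solve_block(i):
        r = b_parts[i] - sum(Q_blocks[i][j] @ x_parts[j]
                             for j in range(s) if j != i)
        x_parts[i] = np.linalg.solve(Q_blocks[i][i], r)

    for i in range(s):               # forward sweep
        solve_block(i)
    for i in range(s - 2, -1, -1):   # backward sweep
        solve_block(i)
    return x_parts
```

Repeating the cycle converges to the solution of the SPD system; the paper's theorem interprets each such cycle as the exact solution of the QP with an added proximal term.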

On Procrustes matching of non-negative matrices and an application to random tomography

We consider a Procrustes matching problem for non-negative matrices that arose in random tomography. As an alternative to the Frobenius distance, we propose a non-symmetric distance based on generalized inverses. Among its advantages is that it leads to a relatively simple quadratic function that can be optimized with least-squares methods on manifolds.
Citation: Accepted for …