Quadratic regularization projected alternating Barzilai–Borwein method for constrained optimization

In this paper, based on regularization techniques and projected gradient strategies, we present a quadratic regularization projected alternating Barzilai–Borwein (QRPABB) method for minimizing differentiable functions on closed convex sets. We show that the QRPABB method with a nonmonotone line search converges to a constrained stationary point. When the objective function is convex, we … Read more
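The excerpt stops before any algorithmic detail; as a rough illustration of the projected alternating Barzilai–Borwein strategy the method builds on, here is a minimal Python sketch, assuming a gradient oracle and a Euclidean projection (the quadratic regularization and the nonmonotone line search of the actual QRPABB method are omitted, and all names are hypothetical):

import numpy as np

def projected_alt_bb(grad, project, x0, iters=100, eps=1e-12):
    # Projected gradient with alternating Barzilai-Borwein step sizes:
    # BB1 = s's / s'y on even iterations, BB2 = s'y / y'y on odd ones.
    x = project(np.asarray(x0, dtype=float))
    g = grad(x)
    alpha = 1.0                      # initial step size
    for k in range(iters):
        x_new = project(x - alpha * g)
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        if np.linalg.norm(s) < eps:  # no progress: stationary point reached
            break
        if k % 2 == 0:
            alpha = (s @ s) / max(s @ y, eps)   # BB1 step
        else:
            alpha = (s @ y) / max(y @ y, eps)   # BB2 step
        x, g = x_new, g_new
    return x

For instance, project = lambda z: np.clip(z, 0.0, None) runs the sketch over the nonnegative orthant.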

On Augmentation Algorithms for Linear and Integer-Linear Programming: From Edmonds-Karp to Bland and Beyond

Motivated by Bland’s linear-programming generalization of the renowned Edmonds-Karp efficient refinement of the Ford-Fulkerson maximum-flow algorithm, we discuss three closely related natural augmentation rules for linear and integer-linear optimization. In several nice situations, we show that polynomially many augmentation steps suffice to reach an optimum. In particular, when using “discrete steepest-descent augmentations” (i.e., directions with the best … Read more
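To fix ideas, an augmentation method improves a feasible point along improving directions until none remains; the three rules discussed in the paper differ in how the direction and step length are chosen. A minimal sketch, assuming integer points as NumPy arrays and a hypothetical finite pool of candidate directions (e.g., a test set):

import numpy as np

def augment(c, x, feasible, directions):
    # Greedy augmentation for min c @ x: apply any improving direction z
    # (c @ z < 0) with the maximal feasible step length, and repeat.
    improved = True
    while improved:
        improved = False
        for z in directions:
            if c @ z < 0 and feasible(x + z):
                step = 1
                while feasible(x + (step + 1) * z):
                    step += 1       # take the longest feasible step
                x = x + step * z
                improved = True
    return x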

Fast Projection onto the Simplex and the l1 Ball

A new algorithm is proposed to project, exactly and in finite time, a vector of arbitrary size onto a simplex or an l1-norm ball. The algorithm is demonstrated to be faster than existing methods. In addition, a wrong statement in a paper by Duchi et al. is corrected and an adversary sequence for Michelot’s algorithm … Read more
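The paper's own algorithm is not reproduced in this excerpt; for reference, the classical sort-based projection onto the simplex, and the standard reduction of the l1-ball projection to it, can be written as follows (a baseline such methods are typically benchmarked against):

import numpy as np

def project_simplex(v, radius=1.0):
    # Euclidean projection of v onto {x : x >= 0, sum(x) = radius}.
    u = np.sort(v)[::-1]                    # sort entries in decreasing order
    css = np.cumsum(u) - radius
    idx = np.arange(1, len(v) + 1)
    rho = np.nonzero(u * idx > css)[0][-1]  # last i with u_i > (cumsum - radius)/i
    theta = css[rho] / (rho + 1.0)          # optimal shift
    return np.maximum(v - theta, 0.0)

def project_l1_ball(v, radius=1.0):
    # Projection onto {x : ||x||_1 <= radius} via the simplex projection.
    if np.abs(v).sum() <= radius:
        return v.copy()
    return np.sign(v) * project_simplex(np.abs(v), radius)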

Preconditioning of Active-Set Newton Methods for PDE-constrained Optimal Control Problems

We address the problem of preconditioning a sequence of saddle point linear systems arising in the solution of PDE-constrained optimal control problems via active-set Newton methods, with control and (regularized) state constraints. We present two new preconditioners based on a full block matrix factorization of the Schur complement of the Jacobian matrices, where the active-set … Read more
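The excerpt stops before any formulas; for background, the systems in question have the generic saddle-point form (notation illustrative)

\[
\begin{pmatrix} A & B^{T} \\ B & 0 \end{pmatrix}
\begin{pmatrix} u \\ p \end{pmatrix}
=
\begin{pmatrix} f \\ g \end{pmatrix},
\qquad
S = -\,B A^{-1} B^{T},
\]

and preconditioners are built around the Schur complement S: with the idealized block-triangular choice \(\mathcal{P} = \begin{pmatrix} A & 0 \\ B & S \end{pmatrix}\), the preconditioned matrix has minimal polynomial of degree two, so a Krylov method converges in at most two iterations in exact arithmetic; practical preconditioners replace A and S by cheap approximations.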

On Second Order Optimality Conditions in Nonlinear Optimization

In this work we present new weak conditions that ensure the validity of necessary second order optimality conditions (SOC) for nonlinear optimization. We are able to prove that weak and strong SOCs hold for all Lagrange multipliers using Abadie-type assumptions. We also prove weak and strong SOCs for at least one Lagrange multiplier imposing the … Read more
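For context, in the usual terminology a weak second-order necessary condition at a local minimizer \(x^{*}\) with Lagrange multiplier \(\lambda\) requires

\[
d^{T}\,\nabla_{xx}^{2} L(x^{*},\lambda)\, d \;\ge\; 0
\quad \text{for all } d \text{ with } \nabla h_{j}(x^{*})^{T} d = 0 \text{ and } \nabla g_{i}(x^{*})^{T} d = 0,\ i \in A(x^{*}),
\]

where L is the Lagrangian, the h_j are the equality constraints, and A(x^{*}) indexes the active inequalities; the strong condition asks the same inequality on the (generally larger) critical cone. The contribution of the paper is to guarantee such conditions under assumptions weaker than the standard constraint qualifications.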

A tight iteration-complexity upper bound for the MTY predictor-corrector algorithm via redundant Klee-Minty cubes

It is an open question whether there is an interior-point algorithm for linear optimization problems with a lower iteration-complexity than the classical bound $\mathcal{O}(\sqrt{n} \log(\frac{\mu_1}{\mu_0}))$. This paper provides a negative answer to that question for a variant of the Mizuno-Todd-Ye predictor-corrector algorithm. In fact, we prove that for any $\epsilon >0$, there is a redundant … Read more

E. Lieb convexity inequalities and noncommutative Bernstein inequality in Jordan-algebraic setting

We describe a Jordan-algebraic version of E. Lieb's convexity inequalities. Joint convexity of a Jordan-algebraic version of quantum entropy is proven. A spectral theory on semi-simple complex Jordan algebras is used as a tool to prove the convexity results. Possible applications to optimization and statistics are indicated. Citation: Preprint, University of Notre Dame, August 2014.
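For orientation, the classical matrix prototype of these inequalities is Lieb's concavity theorem: for any fixed matrix K and \(0 \le t \le 1\), the map

\[
(A,B) \;\longmapsto\; \operatorname{Tr}\!\left( K^{*} A^{t} K B^{1-t} \right)
\]

is jointly concave on pairs of positive semidefinite matrices; the paper transfers statements of this type to the setting of semi-simple complex Jordan algebras.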

Fast Algorithms for the Minimum Volume Estimator

The minimum volume ellipsoid (MVE) estimator is an important tool in robust regression and outlier detection in statistics. We develop fast algorithms for the MVE estimator problem and discuss how they can be implemented efficiently. The novelty of our approach stems from recent developments in first-order algorithms for solving the related Minimum Volume Enclosing … Read more
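The abstract points to first-order algorithms for the related Minimum Volume Enclosing Ellipsoid (MVEE) problem. As a reference point (not the authors' method), the classical Khachiyan-type first-order scheme for MVEE is short enough to sketch:

import numpy as np

def mvee(P, tol=1e-7, max_iter=10000):
    # Minimum volume enclosing ellipsoid of the columns of the d x n
    # matrix P: returns (A, c) with (x - c)' A (x - c) <= 1 for each column x.
    d, n = P.shape
    Q = np.vstack([P, np.ones(n)])            # lift points to dimension d + 1
    u = np.full(n, 1.0 / n)                   # weights over the points
    for _ in range(max_iter):
        X = Q @ np.diag(u) @ Q.T
        M = np.einsum('ij,ji->i', Q.T @ np.linalg.inv(X), Q)  # per-point "leverage"
        j = np.argmax(M)
        if M[j] <= (d + 1) * (1 + tol):       # approximate optimality reached
            break
        step = (M[j] - d - 1) / ((d + 1) * (M[j] - 1))
        u *= 1 - step                         # Frank-Wolfe-type weight update
        u[j] += step
    c = P @ u                                 # ellipsoid center
    A = np.linalg.inv(P @ np.diag(u) @ P.T - np.outer(c, c)) / d
    return A, c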

Block-wise Alternating Direction Method of Multipliers for Multiple-block Convex Programming and Beyond

The alternating direction method of multipliers (ADMM) is a benchmark for solving a linearly constrained convex minimization model with a two-block separable objective function; and it has been shown that its direct extension to a multiple-block case where the objective function is the sum of more than two functions is not necessarily convergent. For the … Read more
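For reference, the two-block model and the classical ADMM iteration the paper starts from are (with penalty parameter \(\beta > 0\) and multiplier \(\lambda\)):

\[
\min_{x,y}\; f(x) + g(y) \quad \text{s.t.} \quad Ax + By = b,
\]

\[
\begin{aligned}
x^{k+1} &= \arg\min_{x}\; f(x) + \tfrac{\beta}{2}\,\bigl\| Ax + By^{k} - b - \lambda^{k}/\beta \bigr\|^{2},\\
y^{k+1} &= \arg\min_{y}\; g(y) + \tfrac{\beta}{2}\,\bigl\| Ax^{k+1} + By - b - \lambda^{k}/\beta \bigr\|^{2},\\
\lambda^{k+1} &= \lambda^{k} - \beta\,\bigl( Ax^{k+1} + By^{k+1} - b \bigr),
\end{aligned}
\]

and the multiple-block case replaces f(x) + g(y) by a sum of m > 2 functions, each updated in turn.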

Self Equivalence of the Alternating Direction Method of Multipliers

The alternating direction method of multipliers (ADM or ADMM) breaks a complex optimization problem into much simpler subproblems. ADM algorithms are typically short and easy to implement, yet exhibit (nearly) state-of-the-art performance for large-scale optimization problems. To apply ADM, we first cast a given problem into the “ADM-ready” form, so the final algorithm depends … Read more
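As a concrete instance of casting a problem into the “ADM-ready” form, a composite problem can be split by introducing an auxiliary variable; this is only one of several possible reformulations, which is precisely why the resulting algorithm depends on the chosen formulation:

\[
\min_{x}\; f(x) + g(Ax)
\quad\Longrightarrow\quad
\min_{x,y}\; f(x) + g(y) \quad \text{s.t.} \quad Ax - y = 0,
\]

after which the standard two-block iteration shown above applies with B = -I and b = 0.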