On the Information-Adaptive Variants of the ADMM: an Iteration Complexity Perspective

Designing algorithms for an optimization model often amounts to striking a balance between the amount of information requested from the model on the one hand, and the computational speed one can expect on the other. Naturally, the more information is available, the faster one can expect the algorithm to converge. The popular algorithm of …

Interior-point solver for convex separable block-angular problems

Constraint matrices with block-angular structure are pervasive in optimization. Interior-point methods have been shown to be competitive for these structured problems by exploiting the linear algebra. One of these approaches solves the normal equations using sparse Cholesky factorizations for the block constraints and a preconditioned conjugate gradient (PCG) method for the linking constraints. The preconditioner is based …
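Schematically, and in our notation rather than the paper's, a block-angular constraint matrix and its normal equations have the structure

$$A=\begin{pmatrix} N_1 & & & \\ & N_2 & & \\ & & \ddots & \\ L_1 & L_2 & \cdots & L_k \end{pmatrix}, \qquad A\Theta A^\top=\begin{pmatrix} B & C \\ C^\top & D \end{pmatrix},$$

where the diagonal blocks $N_i$ carry the block constraints and the bottom row the linking constraints; $B$ is then block-diagonal (one sparse Cholesky factorization per block), and the system for the Schur complement $D - C^\top B^{-1}C$ associated with the linking constraints is the part handled by PCG.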

Weak and Strong Superiorization: Between Feasibility-Seeking and Minimization

We review the superiorization methodology, which can be thought of, in some cases, as lying between feasibility-seeking and constrained minimization. It is not quite trying to solve the full-fledged constrained minimization problem; rather, the task is to find a feasible point which is superior (with respect to an objective function value) to one returned …
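As a concrete illustration, the following is a minimal superiorization sketch in Python; the halfspace projections, the summable step-size schedule, and the toy objective are our own illustrative assumptions, not details taken from the review.

import numpy as np

# Minimal superiorization sketch (illustrative): the basic algorithmic
# operator is feasibility-seeking (a sweep of projections onto halfspaces),
# interleaved with bounded, objective-nonascending perturbations.

def project_halfspace(x, a, b):
    """Orthogonal projection of x onto {y : a @ y <= b}."""
    viol = a @ x - b
    return x if viol <= 0 else x - (viol / (a @ a)) * a

def superiorize(x0, halfspaces, grad_f, n_iter=100, beta=1.0, kappa=0.5):
    """Perturb along -grad_f with summable steps beta * kappa**k,
    then apply one sweep of feasibility-seeking projections."""
    x = x0
    for k in range(n_iter):
        g = grad_f(x)
        norm = np.linalg.norm(g)
        if norm > 0:
            x = x - (beta * kappa**k / norm) * g   # bounded perturbation
        for a, b in halfspaces:                    # feasibility-seeking sweep
            x = project_halfspace(x, a, b)
    return x

# Usage: a point in the intersection of two halfspaces that is superior
# with respect to f(x) = ||x||^2 (so grad_f returns 2x).
halfspaces = [(np.array([1.0, 1.0]), 4.0), (np.array([-1.0, 2.0]), 2.0)]
print(superiorize(np.array([5.0, 5.0]), halfspaces, lambda x: 2 * x))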

A Gentle, Geometric Introduction to Copositive Optimization

This paper illustrates the fundamental connection between nonconvex quadratic optimization and copositive optimization—a connection that allows the reformulation of nonconvex quadratic problems as convex ones in a unified way. We intend the paper for readers new to the area, and hence the exposition is largely self-contained. We focus on examples having just a few variables …
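A classical instance of the connection, stated in standard notation and not necessarily one of the paper's own examples, is the standard quadratic program: for an indefinite symmetric matrix $Q$ the problem on the left is nonconvex, yet it equals a linear optimization problem over the cone $\mathcal{C}^*$ of completely positive matrices,

$$\min_x\{x^\top Q x \;:\; e^\top x = 1,\ x\ge 0\} \;=\; \min_X\{\langle Q, X\rangle \;:\; \langle ee^\top, X\rangle = 1,\ X\in\mathcal{C}^*\},$$

whose conic dual is a copositive program.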

Block-wise Alternating Direction Method of Multipliers with Gaussian Back Substitution for Multiple-block Convex Programming

We consider the linearly constrained convex minimization model with a separable objective function that is the sum of $m$ functions without coupled variables, and discuss how to design an efficient algorithm based on the fundamental technique of splitting the augmented Lagrangian method (ALM). Our focus is the specific big-data scenario where $m$ is huge. A …
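In standard notation (the symbols are ours, not necessarily the paper's), the model in question is

$$\min\Big\{\sum_{i=1}^{m}\theta_i(x_i) \;:\; \sum_{i=1}^{m}A_i x_i = b,\; x_i\in\mathcal{X}_i,\ i=1,\dots,m\Big\},$$

and the object the splitting techniques operate on is its augmented Lagrangian

$$\mathcal{L}_\beta(x_1,\dots,x_m,\lambda)=\sum_{i=1}^{m}\theta_i(x_i)-\lambda^\top\Big(\sum_{i=1}^{m}A_i x_i-b\Big)+\frac{\beta}{2}\Big\|\sum_{i=1}^{m}A_i x_i-b\Big\|^2,\qquad \beta>0.$$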

On the iterate convergence of descent methods for convex optimization

We study the iterate convergence of strong descent algorithms applied to convex functions. We assume that the function satisfies a very simple growth condition around its minimizers, and then show that the trajectory described by the iterates generated by any such method has finite length, which proves that the sequence of iterates converges. Citation: Federal …
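A growth condition of the kind alluded to, stated here generically since the paper's exact assumption may differ, requires that near the set of minimizers $X^*$

$$f(x) \;\ge\; f^* + c\,\operatorname{dist}(x,X^*)^p$$

for some constants $c>0$ and $p\ge 1$, where $f^*$ is the minimum value; such a bound rules out arbitrarily flat behavior around $X^*$.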

Matrix monotonicity and self-concordance: how to handle quantum entropy in optimization problems

Let $g$ be a continuously differentiable function whose derivative is matrix monotone on the positive semi-axis. Such a function induces a function $\phi(x) = \mathrm{tr}(g(x))$ on the cone of squares of an arbitrary Euclidean Jordan algebra. We show that $\phi(x) - \ln\det(x)$ is a self-concordant function on the interior of the cone. We also show that …
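For example, take $g(t)=t\ln t$, whose derivative $g'(t)=1+\ln t$ is matrix monotone on the positive semi-axis. On the cone of positive semidefinite matrices this gives $\phi(x)=\mathrm{tr}(x\ln x)$, the negative quantum entropy, and the result above then yields that

$$\mathrm{tr}(x\ln x)-\ln\det(x)$$

is self-concordant on the interior of the cone, which is the handle on quantum entropy suggested by the title.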

Convergence analysis of sampling-based decomposition methods for risk-averse multistage stochastic convex programs

We consider a class of sampling-based decomposition methods to solve risk-averse multistage stochastic convex programs. We establish a formula for computing the cuts necessary to build the outer linearizations of the recourse functions. This formula can be used to obtain an efficient implementation of Stochastic Dual Dynamic Programming applied to convex nonlinear problems. …
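Schematically, and in generic notation rather than the paper's, such a cut has the familiar outer-linearization form: at a trial point $x_{t-1}^{k}$ for stage $t$,

$$\mathcal{Q}_t(x_{t-1}) \;\ge\; \theta_t^{k} + \big\langle \beta_t^{k},\, x_{t-1}-x_{t-1}^{k}\big\rangle,$$

where $\theta_t^{k}$ is assembled from sampled subproblem values and $\beta_t^{k}$ from subgradients (for instance, optimal dual multipliers); the paper's formula makes these quantities precise in the risk-averse convex setting.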

Block-wise Alternating Direction Method of Multipliers for Multiple-block Convex Programming and Beyond

The alternating direction method of multipliers (ADMM) is a benchmark for solving a linearly constrained convex minimization model with a two-block separable objective function, and it has been shown that its direct extension to the multiple-block case, where the objective function is the sum of more than two functions, is not necessarily convergent. For the …
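For reference, and in the standard notation, the two-block model and iteration read

$$\min\{\theta_1(x_1)+\theta_2(x_2)\;:\;A_1x_1+A_2x_2=b\},$$

$$x_1^{k+1}=\arg\min_{x_1}\mathcal{L}_\beta(x_1,x_2^{k},\lambda^{k}),\qquad x_2^{k+1}=\arg\min_{x_2}\mathcal{L}_\beta(x_1^{k+1},x_2,\lambda^{k}),\qquad \lambda^{k+1}=\lambda^{k}-\beta\big(A_1x_1^{k+1}+A_2x_2^{k+1}-b\big),$$

with $\mathcal{L}_\beta$ the augmented Lagrangian displayed earlier; the direct multiple-block extension simply continues the same Gauss-Seidel sweep over $x_1,\dots,x_m$, and it is this scheme that can fail to converge without a correction step such as the Gaussian back substitution above.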

Self Equivalence of the Alternating Direction Method of Multipliers

The alternating direction method of multipliers (ADM or ADMM) breaks a complex optimization problem into much simpler subproblems. The ADM algorithms are typically short and easy to implement yet exhibit (nearly) state-of-the-art performance for large-scale optimization problems. To apply ADM, we first formulate a given problem into the “ADM-ready” form, so the final algorithm depends …
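To make the “ADM-ready” form concrete, here is a minimal two-block sketch for the lasso in Python; the problem, parameter values, and variable names are illustrative assumptions, not details from the paper.

import numpy as np

# Minimal ADM/ADMM sketch (illustrative) for the lasso written in
# ADM-ready form:  min (1/2)||Ax-b||^2 + gamma*||z||_1  s.t.  x - z = 0.

def soft_threshold(v, t):
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def admm_lasso(A, b, gamma, rho=1.0, n_iter=200):
    n = A.shape[1]
    x = np.zeros(n); z = np.zeros(n); u = np.zeros(n)
    M = A.T @ A + rho * np.eye(n)   # x-subproblem matrix, fixed across iterations
    Atb = A.T @ b
    for _ in range(n_iter):
        x = np.linalg.solve(M, Atb + rho * (z - u))   # x-subproblem: linear system
        z = soft_threshold(x + u, gamma / rho)        # z-subproblem: closed form
        u = u + x - z                                 # scaled dual update
    return z

# Usage on synthetic sparse-recovery data.
rng = np.random.default_rng(0)
A = rng.standard_normal((40, 100))
x_true = np.zeros(100); x_true[:5] = 1.0
b = A @ x_true + 0.01 * rng.standard_normal(40)
print(np.round(admm_lasso(A, b, gamma=0.1)[:10], 2))

Each subproblem here is much simpler than the original problem, a linear solve and a soft-thresholding, which is the point of the splitting.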